[Open Hours] Teaching and Researching with the Audio-Visual Essay | Catherine Fowler

Join us on Friday 22nd February for an Open Hours discussion between 12 noon and 1pm at the Digital Humanities Hub!

Topic – Teaching and Researching with the Audio-Visual Essay

Our guest this week is Catherine Fowler from the Media, Film and Communication programme. She will consider the promise and challenge of audio-visual essays as a research, publishing, and teaching tool.

Over roughly the past five years, Film and Media scholarship has been animated by the idea of the audio-visual essay. Increasingly, the argument is made that media studies has been held back by an inability to respond to its objects of study in their own language: when we write about the audio-visual we are inevitably translating, and something is lost in translation. The DVD brought the opportunity to undertake videographic analysis: by stopping, slowing, and repeating images, students and researchers have been able to interrogate stylistic features and analyse how the art and politics of films come about. Thanks to the increasing availability of digital editing tools, videographic analysis has now moved to another level.

The audio-visual essay has emerged both as a form of assessment for students and as a mode of research for scholars. Its introduction raises questions familiar to all those working with digital tools: how can audio-visual essays be ‘counted’ as research? How can they be used to create new knowledge? And how can we ensure that they improve and support students’ writing and scholarship rather than putting both in peril?

Reading & Viewing

Dead Time – Catherine Fowler, Claire Perkins, and Andrea Rassell

Explore both the essay and the creators’ statement in this audio-visual essay published in the [in]Transition journal. Co-creators Fowler, Perkins and Rassell seek on one hand “to respond to debates about whether media scholars can discover anything new by using eye tracking methods”, and on the other “to contribute to discussions as to the balance between the expository and the poetic in audiovisual essays….”

Unseen Screens: Eye Tracking, Magic and Misdirection – Tessa Dwyer and Jenny Robinson

Another piece from [in]Transition that gives some useful background on what eye-tracking research on screen media typically ‘looks like’.

WHEN: 12pm – 1pm, Friday 22nd February 2019

WHERE: Digital Humanities Hub, Room 1W3, First Floor, Arts Building

WHO: Anyone in the University community – there’s no advance registration required, but we always appreciate knowing in advance if you are planning to come along!

CONTACT: Alexander Ritchie alexander.ritchie@otago.ac.nz

[Open Hours] Transcription, text recognition & cultural heritage computing

Join us on Friday 15th February for a one-hour discussion between 12 noon and 1pm at the Digital Humanities Hub for this week’s Open Hours!

Topic – Transcription, text recognition & cultural heritage computing

Dr Steven Mills is a Senior Lecturer in the Department of Computer Science. His research is in computer vision, using computers to extract useful information from images and videos. He has a particular interest in cultural and heritage applications, including collaborations with archaeologists, archivists, and artists. He will present the results of preliminary work using deep neural networks to recognise letters and words in handwritten documents from the Marsden Online Archive. He will also attempt to explain what deep neural networks are, apart from “very mysterious”.
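For readers curious about what sits behind the “very mysterious” label, here is a minimal, illustrative PyTorch sketch of the kind of convolutional network used to recognise handwritten characters. The layer sizes and the 26-letter output are assumptions made for illustration only; this is not the pipeline actually used on the Marsden Online Archive.

# Minimal sketch of a convolutional network for handwritten character
# recognition (illustrative only; not the Marsden Online Archive pipeline).
import torch
import torch.nn as nn

class CharacterCNN(nn.Module):
    def __init__(self, num_classes: int = 26):  # assumption: 26 letter classes
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1-channel greyscale crops
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# A dummy batch of eight 28x28 letter crops, just to show the shapes involved.
model = CharacterCNN()
scores = model(torch.randn(8, 1, 28, 28))
print(scores.shape)  # torch.Size([8, 26]) -- one score per candidate letter

Trained on labelled letter crops, a network like this learns the visual features that distinguish one character from another; the real challenge with archival handwriting is segmenting the page into those crops in the first place.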

Lynn Benson is the Researcher Services Manager for Hocken Collections. She will explore some international initiatives and developments that offer possible paths for the Library to follow as we work to improve delivery of our digitised and born-digital collections.

Reading – Here’s How Google Deep Dream Generates Those Trippy Images | Madison Margolin

An introduction to the images produced by Google’s Deep Dream computer vision platform, with an excellent video explanation by Dr Mike Pound.

You can even try generating your own images with one of the online Deep Dream generators: https://deepdreamgenerator.com/

A Deep Dream of the Otago University Clocktower: an image of the clocktower processed and filtered by Google’s Deep Dream software.
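If you would rather see the trick in code than use a web generator, the following is a rough sketch of the Deep Dream idea: gradient ascent on an input image to amplify one layer’s activations. It assumes PyTorch and torchvision (0.13 or later) are installed and that a local file named clocktower.jpg exists; the layer choice, step size, and step count are arbitrary illustrations, not Google’s actual implementation.

# A minimal Deep Dream-style sketch (assumptions: PyTorch + torchvision >= 0.13,
# and a local image "clocktower.jpg"; all hyperparameters are illustrative).
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

model = models.vgg16(weights="DEFAULT").features.eval()

img = TF.to_tensor(Image.open("clocktower.jpg").convert("RGB")).unsqueeze(0)
img.requires_grad_(True)

layer = 20  # which conv layer's activations to amplify (an arbitrary choice)

for _ in range(30):                      # gradient-ascent steps
    x = img
    for i, module in enumerate(model):
        x = module(x)
        if i == layer:
            break
    loss = x.norm()                      # "dreaming" = boosting this layer's response
    loss.backward()
    with torch.no_grad():
        img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.clamp_(0, 1)                 # keep pixel values in a valid range
        img.grad.zero_()

TF.to_pil_image(img.detach().squeeze(0)).save("clocktower_dream.jpg")

The choice of layer is what gives the images their character: early layers amplify textures and edges, while deeper layers hallucinate eyes, dogs, and other objects the network was trained to recognise.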

Projects – READ, Visualize the Public Domain, Venice Time Machine, Arabic Scientific Manuscripts, Gravitron

WHEN: 12pm – 1pm, Friday 15th February 2019

WHERE: Digital Humanities Hub, Room 1W3, First Floor, Arts Building

WHO: Anyone in the University community – there’s no advance registration required, but we always appreciate knowing in advance if you are planning to come along!

CONTACT: Alexander Ritchie alexander.ritchie@otago.ac.nz

Te Pōkapu Matihiko o Te Kete Aronui | the Digital Humanities Hub: Coming up in 2019


Heard about the digital in the Humanities, but wondering what all the fuss is about?

2019 will be a busy and exciting year at Te Pōkapu Matihiko o Te Kete Aronui | the Divisional Digital Humanities Hub as we explore local and global projects, demonstrate tools, and critique thinking and practice in the digital realm.


[Open Hours February 8th] Finding & Cleaning Data

Join us on Friday 8th February for a one-hour discussion between 12 noon and 1pm at the Digital Humanities Hub for our weekly Open Hours!

Topic – How to find and clean data

Humanities data is often messy and hard to access. This session will introduce OpenRefine as a tool for accessing and cleaning up data, with DigitalNZ collections as an example.
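As a taste of what “cleaning” usually means in practice, here is a small pandas sketch of the sort of tidying OpenRefine makes point-and-click: trimming whitespace, unifying capitalisation, collapsing near-duplicate values, and removing duplicate rows. The file digitalnz_export.csv and its column names are hypothetical placeholders for a CSV exported from a DigitalNZ search, not a real dataset.

# A minimal pandas sketch of typical OpenRefine-style clean-up
# (file name and column names are hypothetical placeholders).
import pandas as pd

df = pd.read_csv("digitalnz_export.csv")

# Trim stray whitespace and unify capitalisation in a creator column.
df["creator"] = df["creator"].str.strip().str.title()

# Standardise obvious variants of the same value (cf. OpenRefine's "cluster & edit").
df["format"] = df["format"].replace({"Photo": "Photograph", "photograph": "Photograph"})

# Drop exact duplicate records and rows missing a title.
df = df.drop_duplicates().dropna(subset=["title"])

df.to_csv("digitalnz_clean.csv", index=False)

OpenRefine does the same kinds of operations through facets and clustering rather than code, which makes it a gentler starting point for the messy, hand-entered metadata common in humanities collections.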

Readings

A Gentle Introduction to Data Cleaning

The Quartz guide to bad data

WHEN: 12pm – 1pm, Friday 8th February 2019

WHERE: Digital Humanities Hub, Room 1W3, First Floor, Arts Building

WHO: Anyone in the University community – there’s no advance registration required, but we always appreciate knowing in advance if you are planning to come along!

LIBRARIAN: Chris Seay chris.seay@otago.ac.nz