The Digitally Literate Classroom: Reading Inanimate Alice

Another fascinating post from Jess Laccetti’s blog at Frontline that uses Inanimate Alice as the basis for a lesson on digital literacy. What I find amazing is that Jess has managed to clearly discern and articulate a range of decisions that were made as part of an organic design process: ‘this looks/sounds right’, rather than an explicit ‘this should go here because…’. You always hope that these decisions make narrative and multimedia sense – and a lot of thought goes into them – but it is very encouraging to know that they make some kind of theoretical and pedagogical sense too.

One thought I had when reading the lesson plans is about the difference between sound and music… does a distinction need to be made there? What influence does the music have on the narrative compared to the ‘incidental’ sound effects?

Another is the role of Brad: is he real or imaginary? What is the relationship between Alice and Brad? How does he know where Alice’s Dad will be found? (This is difficult to discuss without giving away the plot – congrats to Jess for managing to avoid that!)

4 Comments

  • Jess says:

    Wow, such kind words from one of the creators of Inanimate Alice. Thanks. :)

    IA certainly had a lot to offer students in an educational setting. It was relatively “easy” to delve into the whys and hows of the multimodal effects. It’s part of critical literacy and it’s part of plain old literacy – understanding the story and understanding/analysing how it is told. These are basic ideas which don’t seem to be materialising in the context of digital literacy.

    Interesting question re: the difference between music and sound. I asked myself that question when I found I would unconsciously choose one term over the other at different points. For me, sound, with ref. to IA, seems to fulfill more of a purpose (doors closing, electronic interference) – in fact you say “incidental” – whereas music is more like foreshadowing…the speedy crescendo when Alice is searching for her dad, etc…it’s narrative in itself. It’s as though music has a regular pattern of repetition, something more constant than jagged knocks. Does this make sense?

    Re: Brad…that was tricky, and that’s why I didn’t include any questions about him. I felt I’d give too much away. But maybe (at least by episode 3) it’s not that important a question to ask…more important that he is there rather than not?

  • cjoseph says:

    Hi Jess,

    Yes, I definitely agree – the sound effects serve specific purposes, while the music is more of a general mood-setter. The electronic interference, however, sits somewhere between the two… it is a mood-setter, but it also suggests something that is quite crucial to the overall 10-episode arc (electromagnetic radiation), and so it is also performing a specific narrative purpose (beyond that which music naturally does anyway).

    There is possibly another interesting distinction, in that the music is (presumably?) created by Alice as part of her multimedia autobiography, and is thus part of the general painting of her character and skillset (she is in her late twenties when she creates these stories, though this part has not yet been made clear) – while the sounds are not?

    Good questions re Brad… again, the development of the 10-episode arc is very relevant to discussions of Brad and his importance, but sadly I can’t say more without giving the story away ;)

  • Jess says:

    Def. don’t give the story away!

    I’ve been thinking about the electronic interference sound and have reread episodes 1 and 3. For some reason, in episode 1 that noise sounds far more ominous to me, but more like a background noise, while in episode 3 I’m quite aware of the interference and see it more as part of the story.

    Also, the first time I heard that noise I thought it was my mobile…so maybe it also works as a device that (at least during my first reading) plays on the pull between interaction and immersion?

  • Chris Joseph says:

    Yes, the EMR sound is definitely louder in episode 1. Too loud sometimes, we thought, so it was reduced by a quarter to a third in episodes 2 and 3. At some point I may go back and reduce the volume in episode 1 too, but your point about immersion is interesting. Everyone who has a mobile/cellphone has heard that sound when they are near a TV or monitor and their phone is about to ring, or sometimes when it is sending/receiving location data to/from local transmitters. I think even I automatically looked at my phone when working on Ep 1!