HP Reveal, UDL, & AAC!
I am having fun discovering the newly renamed HP Reveal. The first iteration of this app was called Aurasma. This app allows you to create “Auras” associated with particular objects or images on an iOS or Android device. It allows you to use augmented reality in some pretty exciting ways.
In other words, you can take a picture of an object, or an image, and associate content with that image. Once your Aura is created, you can use the app to scan that image and your content, such as a video, will pop up.
How could this potentially be used for supporting students? Imagine associating a dynamic video with an image of the classroom word of the week. If you are teaching the word “speed”, you could add a video of a fast train or a cheetah. Use the app to create Auras to add to your classroom word wall.
How about a dynamic visual schedule? Or a task analysis? If a student is learning to wash her hands, she could use the app to view short videos of each step. An adult who is learning new tasks in a work environment could do the same thing. This has the potential to help that person become more independent and stay on task.
You can use the online Aurasma Studio to create Auras that can be shared. A teacher could have students follow their account. In that way, the Auras can be viewed on multiple devices. You do have the option to keep Auras private.
Flip this around, and you could use the app to let your students create videos to aid in presenting their work. Do you have a student who hesitates to present in front of the entire class? Have them create an Aura so the other students can access the presentation on their smartphones.
We can take this a step further. You can use videos that you create in other apps, as long as you save them to your camera roll. This provides you with lots of scope for play and accessibility.
For instance, you can use the app Sock Puppets to create an animation with your voice providing the audio. In Sock Puppets, you can choose from different puppets, backgrounds, and animations within the app.
You can also use Clips on an iOS device. This app allows you to take pictures and record video. You can use different filters and add animated stickers to the screen. Clips also allows you to caption your video as you create it! You can place your captions anywhere on the screen, or have them scroll across a red banner at the bottom. These videos can be saved to your camera roll. In a nutshell, students can be creative and have fun as well as make their videos accessible.
How could this be used to support augmentative and alternative communication? Imagine a classroom that uses a core vocabulary poster to teach language to students with disabilities. You could use Aurasma Studio to associate dynamic learning videos with individual core vocabulary icons. In this way, you provide multisensory input for language learning.
Use Clips to create these videos, and they can be captioned! Think about how this supports the goal of infusing literacy into language learning. One tip: think about what you will say ahead of time. You want to keep the auditory and written input ‘one step ahead’ of your AAC learner’s abilities. Working on two-word phrases? Caption two-word phrases.
Students and teachers can even go on a core word scavenger hunt around the school. Imagine taking videos of core vocabulary words “in action”. Those videos can then be used to create Auras that students can access when back in class.
Can you think of other ways to use augmented reality to support learning? Please leave a comment and let me know!