Tuesday, December 31, 2013

New Year - new goals? Nope.

The start of a new calendar year comes about mid-way through the school year.  While everyone else is setting new goals, making new resolutions (or revitalizing their old ones), I’m taking stock of where I am in meeting the school year’s goals and my goals for the teacher training that I do.  
I work with a lot of different teachers and SLPs in different districts.  Mostly I am teaching them how to implement aac use in their classrooms, how to get students using aac to increase their language and communication skills, and how to grow students’ language and literacy skills.  
The goal is all about communicating.  Not just the students’ communicating, but mine, as well.  Am I helping teachers understand?  Am I adequately communicating the skills and principles involved and helping teachers feel comfortable implementing them?  And how are the teachers doing?  Are they really listening to me?  (Yes, most are!)  Some teachers and SLPs start this journey with some trepidation.  If they are not familiar with aac systems or with working with students who use them, they may be initially resistant or shy or a little afraid.  I try to show them that it’s not hard, and not really all that different.


So, as the new year rings in and we gear up to go back to school, it is time to take stock.  For the new year, I will leave you with some information about aac in general (a free handout), and some guidelines and ideas for implementing aac and increasing language.  Have a safe, happy, and successful rest of the school year!

Monday, December 23, 2013

Story Elements/Story Grammar - Important Language Skills

Once students have the ability to sequence events in their daily routines and have moved on to providing personal narratives about their experiences, it’s time to talk about story organization.  The ability to re-tell stories provides children with skills they need to understand stories, to grow their language complexity, to interact socially with others, and, eventually, move into learning from informational text.
Typically developing students get lots of practice retelling stories by “reading” to their stuffed animals, dolls, and other toys.  They “read” to their parents.  And some talk about what they’ve read with peers.  Students with significant language problems rarely, if ever, get these experiences.   They don’t have the skills needed to retell stories, and without this practice, they don’t learn how to.....re-tell stories.  Quite circular, isn’t it?
Typically, our students work well when provided with visual cues/supports.  There are lots of different versions of visual cues for storytelling and re-telling, as well as for developing narratives.  At the simplest level are symbols for Who, What, When, and Where to pick out the main ideas and facts.  Sometimes we may add Feelings or Opinion symbols, either to prompt an identification of a character’s feelings or to express how the student felt about the story.
There is a lot more to a story than those bare bones, however, and we strive to support kids in finding all of the elements in a story structure or story grammar.  These include identifying what event started the story or began the action, identifying how the character felt and what (s)he planned to do about it, listing the things the character did throughout the story and their effects, describing what impact the time or place had upon the events or character(s), and telling what the resolution or conclusion was and how the character(s) felt about it.  When I worked with students with language learning disorders, we worked with all of these elements: identifying and describing them, sequencing and summarizing them, analyzing them.  But with my students with autism and other developmental disabilities, our exploration is usually more circumscribed.
One tool I use is the iPad with a story/book creating app.  Here’s how: Take pictures of the book illustrations and have students place those in order once imported into the app.  Ask them to tell who is in the picture and what happened in the story at that point.  Use symbol cues to remind students what they need to include.  Creating a story retelling with the support of symbol cues and the story graphics provides support for children to formulate responses.
Another great tool, one which I used a lot when I did language therapy, is the Story Grammar Marker and its templates.  Mindwing Concepts has grown amazingly since I first heard about the SGM when Mary Ellen wrote the first manual a couple of decades ago.  Check out their website here.  There are some fabulous products for helping your child/student talk about stories, both personal and written.


In the meantime, enjoy another tool I use a lot - a story element die.  Laminate, cut and paste together this die.  Students take turns rolling the die and identifying or talking about the element pictured.  Works well in history, too!

Wednesday, November 20, 2013

Language and Literacy: The iPad in the Special Ed Classroom

I have been talking about some specifics of using the iPad and apps in the special education classroom for developing language and literacy skills.  I have done a number of workshops and seminars on the topic, predominantly in the Southern California area.  If you haven’t been to one, you can access a handout in my TPT store here.  It speaks to iPad content for language and literacy, and using iPads for aac.  Just as language and literacy are intertwined, so are augmentative communication and literacy intertwined.  I address, as I have to some extent here in this blog, adapting activities for nonverbal and minimally verbal kids, and apps to help achieve those goals.
Evidence-based practice calls for 90 minutes per day of reading instruction for general education students.  Add 30-60 minutes for struggling readers.  How many students with autism or significant disabilities receive 2 or more hours per day of reading instruction?  While the research on literacy instruction with nonverbal students is relatively young, in the words of David Yoder, “No child is too anything to learn to read.”
In the posts on guided reading, I have spoken to ways to use books and story apps to increase language skills and build on literacy development.  I hope you have found them helpful.  Check out the handout, if you’re interested in more depth.


Keep reading!

Monday, November 4, 2013

Guided Reading with Story Book Apps

So, I suppose the next logical discussion is about using iPad storybook apps in guided reading sessions.  So many of the kids I work with have iPads - mostly for aac - and almost every classroom I am in has additional iPads for the teacher, as well.
Many of these kids just don’t get “grabbed” by turning the pages of a book, especially one that doesn’t “talk” to or “read” to them when they’re alone.  The advantage of story book apps - at least the good ones - is that not only are they bright and engaging, but they are interactive, offer feedback and additional input, and make reading fun.
As with everything else on the iTunes store, there are plenty of bad story book apps - or at least mediocre ones.  There are some really great ones, too.  I love the Nosy Crow apps.  They are fun and colorful, both the characters and the text are interactive, and I really like the commenting that the characters do.  Great modeling!


There are, of course, a lot of the same stories you find in print.  The famous cat in the red and white striped hat, the Little Critter books, Berenstain Bears, and Sesame Street.  I admit to being ridiculously happy to see one of my old favorites - The Monster at the End of This Book - in an app.  Using stories that are familiar - even if not age appropriate - can help with interest and attention, as well as comprehension.
There are also multiple versions of many of the old stand-by fairy tales.
While you’re reading the story apps - or the iPad is reading them for you - have students describe the characters, describe themselves, and compare and contrast the two; describe the setting, describe where they live, and compare and contrast the two; tell what happens in the story and what they think might happen next.  Point out unfamiliar words.  I like the apps that say the word when you tap on the graphic.  Can they tell you what it means, or what kind of a word it is?
Much reading comprehension failure is due to a lack of vocabulary.  Typical students acquire much of their vocabulary through their experiences.  Many disabled students don’t have those experiences.  Use the story book apps to highlight vocabulary, talk about the words, and build word knowledge.  Play vocabulary games with the words, then go back and put them in context in the story apps.

Have fun with the great story book apps out there,  and add them to your therapy, teaching, or parenting arsenal. Keep reading.  Keep talking.


Wednesday, October 23, 2013

Guided Reading for Special Education/Language Learners


Shared and guided reading are great ways to build language skills and engagement.  Building interactions around books also can encourage children to want to read more, to learn to read, and to find enjoyment in reading.  Shared reading has been demonstrated to be one of the best influences on later vocabulary and reading skills.  Having interactive conversations with kids around the book being read builds vocabulary knowledge and inferencing and predicting skills, and develops higher order thinking when the right types of questions are asked.  While reading, the teacher (or parent) pauses to invite predictions, ask questions, and make explanations.  The reading is interactive.  One purpose is to expose students to stories they may not be able to read themselves, providing experience with richer vocabulary and syntax.  Another purpose is to provide those structured interactive experiences with specific questions and prompts that enable the students to build language - and reading - skills.

In special education classes with students with more complex communication needs the focus is frequently on a combination of reading strategies and listening comprehension strategies, rather than always on specific reading strategies.  Students may have specific comprehension “props” to hold and manipulate. They may be told to listen for specific information, and can hold up these props when they hear what they are listening for.
There is a purpose for reading established before the book is read, and comprehension activities afterwards.  Ideally there is a pre-reading activity that matches the purpose for reading and the post-reading comprehension activity. 
A good quality reading session allows opportunities for students to participate.  

Ask open-ended questions.  Pause for students to fill in predictable words.  Elaborate on students’ responses.  Point out new or interesting vocabulary.  Move from asking questions whose answers are easily visible on the page - particularly in illustrations - to questions that compare, contrast, infer, predict.  If you need to help the student with a response,  re-read the part of the text that has the answer before providing a model.
Above all - keep reading and keep talking!



Monday, October 14, 2013

Phonological Awareness Skills for AAC Users


Sometimes teachers are unsure how to teach or evaluate phonological awareness skills in students who cannot speak - and therefore cannot “say” letter sounds or words.  I frequently remind teachers and therapists that teaching these skills to nonverbal students is really not different from teaching them to verbal students.  Only the mode of response is different.  Rather than making a verbal response, students respond using the same mode they use for all other expressive tasks - picture based communication.
For example, we ask students to find words that begin with the same sound as a spoken word.  OK, we use pictures for this task, and that’s easy.  When we ask students to name words that begin with the same sound as a spoken word, we need to make sure that they have a sufficient aac system, or we need to provide a choice of words from which they can choose.
Light and McNaughton have some good information on their program available.  In essence,  a choice array is provided for all tasks, from identifying rhyming words to sounds to number of syllables in words.
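To make the choice-array idea concrete, here is a rough sketch in Python of how such a trial could be mocked up.  This is purely my own illustration - the word bank, the function names, and the letter-based stand-in for “first sound” are all hypothetical simplifications, not anyone’s actual program (a real task would work with phonemes, e.g. the “sh” in “shoe”):

```python
# Hypothetical sketch of a picture-based phonological awareness trial:
# the student picks, from a choice array, the word/picture whose name
# begins with the same sound as a spoken target word.

import random

def first_sound(word):
    """Very rough first-sound stand-in: the initial letter, lowercased."""
    return word[0].lower()

def make_choice_array(target, word_bank, size=4):
    """Build a choice array: one correct match plus distractors."""
    correct = [w for w in word_bank
               if first_sound(w) == first_sound(target) and w != target]
    distractors = [w for w in word_bank
                   if first_sound(w) != first_sound(target)]
    choices = [random.choice(correct)] + random.sample(distractors, size - 1)
    random.shuffle(choices)
    return choices

def is_correct(target, choice):
    """Did the student pick a word with the same first sound?"""
    return first_sound(choice) == first_sound(target)
```

So for the target “bag,” a student might be shown “ball,” “sun,” and “cat,” and respond by touching the picture - no speech required.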
Here are some examples for you:






Monday, September 30, 2013

Literacy for AAC Users - a beginning


Today I want to talk about literacy activities for kids who are nonverbal and use augmentative communication - particularly picture based communication.  You probably won’t be surprised to learn that most of these students are never taught literacy skills.  And yet, how can students grow into adults who have a place in a social environment, a work environment, or an academic environment if they can’t read and write?  Even many severely disabled students who are verbal do not receive sufficient - or any - literacy instruction.
Evidence-based practice says that general education students should receive 1 1/2 hours per day of reading instruction.  Students who struggle with reading skills should receive an additional 30-60 minutes per day of support and instruction.  Wanna take a guess how many hours of literacy instruction most aac users get?  Well, if you guessed less than 1, you’d be right.  Many get none at all.
The research on reading instruction as evidence-based practice for aac users is relatively new, but it is there.  And it needs to get into the classrooms.  Many teachers aren’t sure how to go about teaching reading to kids who can’t make the sounds, read the words out loud, or demonstrate fluency or comprehension.  David Koppenhaver (2008) once said, “I’d argue that you teach reading to AAC users the same way you teach any kid - balance, balance, balance.”
So, what do we know about kids learning reading?  First of all, they need to be read to.  Parents and teachers need to read to kids all the time.  Lots of different books, and a variety of kinds of books.  By reading to them we help to provide motivation, to teach the process of reading, to give them experiences with books that they can’t read on their own, to provide meaning to the whole idea of written language.  Talking about stories as we read them builds language and thinking skills, and can facilitate reading comprehension.  Shared reading, with an adult or older student, has been shown to be one of the best influences on later vocabulary and reading skills.  Having interactive conversations around the story adds to background and vocabulary knowledge and higher order thinking skills when the right questions are asked.
We have a tendency to ask “What” type questions.  What?  What is it?  What do you see?  What is that?  These are not the questions that develop higher order thinking skills or encourage language development.  We need to be asking questions that use a variety of Wh- types and that ask for sequencing, describing, retelling, comparing and contrasting, and feelings.  We need to ask open-ended questions, to make sure we give kids sufficient time to respond, and to elaborate on their responses.

We also have a tendency to read less to our kids with disabilities, and to forget to use the same basic structure we use when teaching their non-disabled peers.  When we do read to them, we don’t ask as many questions, provide as much interaction, or prepare them for the experience.  We don’t set the purpose for reading, activate their background knowledge, or provide activities related to that purpose.  And we don’t often give them the opportunity AND the means to get practice in retelling the stories.
This last is very important.  We know how important it is for kids to gain confidence in pre-reading skills by re-telling stories to friends, parents, dolls, stuffed animals.....  Our aac users don’t get this practice with vocabulary and syntax, event sequencing, and more.  When we give them the symbol supports to re-tell stories, it increases their engagement, increases opportunities to build language, and increases their expressive vocabulary through books.  While this isn’t directly increasing their literacy skills, it is increasing their language skills, which are crucial to the process. (Use the link above to get my free story elements/re-telling symbol supports die.)

Next time, more specific skills for phonological awareness.


Wednesday, September 25, 2013

What Can I Do with a Single Communication Picture or Simple Message Button?


In a recent discussion with a teacher, I was asked what I would recommend doing with all the simple single message devices and assortment of picture communication symbols she had floating around.  A number of the kids (with autism) in her class have iPads with aac apps, but she felt like there ought to be something she could do with the BigMack and iTalk and Sequencer buttons.  And what about just having some pictures here and there?

So, here are some of my suggestions to her - and to you:

How many ways can you use 1 picture?  To request (I want, want, that, give, get), to greet (hi, hello, hey there, bye), to accept or reject (yes, no, don’t, not), to protest (stop, don’t, not, away), to direct (there, here, give, get, put), for cessation or continuation (more, stop, different, go), for possession (my, mine, your, his), to participate (yes, no, repeating line, specific response).  Think about all of those communication functions.  It is much more functional to provide a variety of intents than to teach the same number of nouns.
Make your own picture dictionary.  Print out pages of category related nouns and verbs by location or association, adjectives of size, color, and texture, adverbs, prepositions, etc.  Make two copies of each page; laminate one set and put one of each into a page protector (or you can laminate them both).  Cut apart one set and velcro the pieces to their matches, which you have now conveniently put into a binder.  Voila!  A dictionary of picture words you can use whenever you need them.
Teach vocabulary: 1:1 correspondence, defining, describing, locating, synonyms and antonyms, category associations.  Make word webs, story maps, word banks.  Practice sentence making.  Sort by same/different color or initial sound or ending rhyme (word family).
Take those single message buttons and program a repetitive line from a story book, a message like “turn the page,” a greeting, a joke of the day, an attention getting response, or “I need a break!”  Put it in the middle of the table at lunch time and have it say “More, please.”  Record one core word at a time to teach those kids who don’t have their own iPad (or other speech output system).  Now you can practice help, more, stop, go, all done, don’t, make, play, eat, drink, read, sleep.

Pick up those discarded talking photo albums and make shopping lists, picture recipes, personal information, the steps to a task, lines to a song, social stories or scripts.

I hope that was enough to be going on with.  And maybe I got you thinking about some additional ideas.  I hope so.

I’ll be back with ideas for literacy.


Wednesday, September 11, 2013

SoundSwaps App for Listening to and Moving Sounds in Words


Some students have difficulty with awareness of individual sounds in words, with judging whether two sounds are the same or different, and with the order in which to put sounds to form a specific spoken (or written) word.  These students have difficulty discriminating speech sounds in sequences and perceiving and comparing the different sound patterns in words.  They may delete or add sounds and syllables in words.

The goal of the SoundSwaps app is to assist students to improve decoding and encoding skills through improved auditory conceptualization.  (It’s great practice for all students, but was originally designed for students with dyslexia and auditory processing disorders.) Students will practice seeing and hearing words and learning where and when sounds are deleted, added, or moved to make new words.
This app uses errorless learning.  The app repels incorrect choices without drawing attention to the error.  There is never a “No” or “Incorrect” message, no negative feedback, no red X.  Just a subtle refusal to go someplace incorrect, and then visual cues for correct responding.
Positive reinforcement is continuous.  There is verbal reinforcement after every trial, and greater reinforcement after each sequence of changes.  The letters spin, jump, and fly off the screen at the end of each string of words, with accompanying whistles.  And there is applause with whistles after completing each level.
In the SoundSwaps app, letters come onto the screen one at a time.  When all of the letters are on screen, the app says the whole word, e.g., “bag.”
Then it says: “If this says ‘bag,’ make it say ‘bat.’”
Speak 1 and Speak 2 buttons allow the user to hear the words repeated as often as necessary.
There is a trash can in the lower right corner of the screen into which letters that are not needed in the new word are dragged, so that they can be replaced with the new letter.  Alternatively, dragging the new letter on top of the old one will simply replace it.
An on-screen keyboard pops up with consonants in blue and vowels in yellow.  When the correct key is touched the letter pops out from it to be dragged up to the word.  (The key with its text remains in place.)
In Step 1, some keys are hidden, reducing the complexity of the task.
In Step 2, the full keyboard is used, still with the colors differing for vowels and consonants.
In our example, the b and a can’t be dragged to the trash or moved when touched, but the g can be dragged to the trash can.  If the user tries unsuccessfully to drag letters 2 times, the trash can will light up and pulse, or the correct letter to use will pulse.  (Which prompt appears depends upon what the user is doing or not doing.)
On the keyboard only the correct letter key can be dragged up to the open space in the word  (the key itself doesn’t move - the letter remains on the keyboard, but a duplicate moves up, as stated above).
After 2 incorrect tries (touches to wrong keys on keyboard) the correct letter(s) will highlight and pulse (in sequence if more than 1) as prompts/cue.
The data tracker records % correct and % prompted, along with which activity and which level (1 or 2) was used.
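As a rough illustration of the behavior described above - touches on letters that shouldn’t move are silently ignored, a prompt appears after 2 wrong tries, and the tracker tallies % correct and % prompted - here is a small Python sketch.  The class and function names are my own invention, not the app’s actual code:

```python
# Hypothetical sketch of SoundSwaps-style errorless learning:
# only the one letter that must change responds to a touch; wrong
# touches are inert, and after 2 of them the correct target is
# highlighted as a prompt.

def changed_index(current, target):
    """Return the index of the one letter that differs (e.g. bag -> bat)."""
    diffs = [i for i, (a, b) in enumerate(zip(current, target)) if a != b]
    assert len(diffs) == 1, "this sketch handles single-letter swaps only"
    return diffs[0]

class Trial:
    def __init__(self, current, target):
        self.change_at = changed_index(current, target)
        self.wrong_tries = 0
        self.prompted = False
        self.correct = False

    def touch(self, index):
        """Simulate the child touching the letter at `index`."""
        if index == self.change_at:
            self.correct = True          # only the right letter responds
        else:
            self.wrong_tries += 1        # wrong letters simply do nothing
            if self.wrong_tries >= 2:
                self.prompted = True     # highlight/pulse the correct letter

def session_stats(trials):
    """Percent correct and percent prompted, as a data tracker might report."""
    n = len(trials)
    pct_correct = 100 * sum(t.correct for t in trials) / n
    pct_prompted = 100 * sum(t.prompted for t in trials) / n
    return pct_correct, pct_prompted
```

For the “bag” to “bat” example, touching b or a does nothing; touching g succeeds, and two misses first would trigger the pulse prompt.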
Level 1 - Word Families: The ending sounds stay the same.  Only the initial sound changes.  
Level 2 - Initial or final sounds change or are added/deleted
Level 3 - The vowel sounds change.  
Level 4 - Anything can happen.  Changes can be anywhere within the word, so listen carefully!  
  
For those who want a paper version, try these resources:

http://www.teacherspayteachers.com/Product/Word-Families-Task-Cards-772650 for students who need practice with word families.


Get a free sample of the task cards for swapping sounds here.  Note that the app does not use pictures, since the words are spoken.



Monday, September 2, 2013

Answering Wh-Questions - There's an App for That: Question It


QuestionIt is an amazing app - even if I do say so myself.  Fortunately, other people are saying it, too.  I developed QuestionIt from a therapy activity I had done with kids with autism for many years with great success.
Parents were always telling me to publish it.  They all said that multiple therapists and teachers had tried to teach their kids how to answer Wh-questions - all without success.  Until I came along, with my bags of color coded pictures and systematic errorless learning.  Of course, I had no time to even think about what I would need to do to publish this as a marketable tool, so it just never happened.
But then the iOS revolution came along.  Speech and language therapy apps were coming out, and I decided this was the time and this was the format.  I found a wonderful team of programmers - Ditty Labs - right here in San Diego.  They persevered with me and my lack of technological savvy (once you take me away from AAC, technology sometimes mystifies me).
So: QuestionIt.  QuestionIt (?it) is the app for teaching students with autism and other significant language disorders how to answer Wh- questions.  Four activities use systematic fading of color cues and errorless learning techniques to teach students what type of word answers which type of question.  A data management feature allows speech pathologists and teachers to track data for individual students, providing accuracy percentages for each type of question in any given session.  The SymbolStix icons used are familiar to many students who use visual cues and symbol-based communication.

Activity 1 asks students to sort words into categories by the type of question they answer.  “Boy” is a person; a person answers “Who.”  Full color cue, partial color cue, and no color cue levels are available.  Hundreds of words are provided for sorting people, places, times, and actions.  Errorless learning is used, allowing only the correct response to be registered.  Verbal feedback is consistent.  Fireworks are fun motivators.

Activities 2 and 3 present sentences and ask Wh- questions in random order.  While Activity 2 uses the same sentence structure for all sentences (Who is doing what, where, and when?), Activity 3 offers variations on the word order.  More than 4,000 sentences are available, minimizing student memorization and acclimation.  Activity 2 has 3 levels of play: full color cue, partial color cue, and no cue.  Activity 3 has only partial color cue and no cue levels.

Activity 4 presents three loosely related sentences as a paragraph, continuing to ask questions in random order.  Activity 4 has 2 levels: partial color cue and no cue.  Again, hundreds of items are available for practice.
Overall, there are more than 10,000 questions in this app.

In all levels, errorless learning is provided.  Only one response is accepted and moves the student on to the next item.  Therapists and parents can control the pace of play by using the arrow to move to the next items after any discussion they want to have.  Every 5 correct responses provides fireworks, which grow in intensity as more and more correct responses are amassed. 
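The per-question-type accuracy percentages that the data feature reports could be tallied along these lines.  This is a hypothetical sketch of my own, not the app’s implementation - the class and method names are invented for illustration:

```python
# Hypothetical sketch of per-session, per-question-type accuracy
# tracking, in the spirit of QuestionIt's data management feature.

from collections import defaultdict

class SessionData:
    def __init__(self):
        # "who"/"what"/"where"/"when" -> list of True/False results
        self.results = defaultdict(list)

    def record(self, wh_type, correct):
        """Log one response: which Wh- type was asked, and if it was right."""
        self.results[wh_type].append(correct)

    def accuracy(self):
        """Accuracy percentage for each question type in this session."""
        return {wh: round(100 * sum(r) / len(r), 1)
                for wh, r in self.results.items()}
```

A therapist-facing report could then show, say, 50% on “Who” questions and 100% on “Where” questions for a given session.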
QuestionIt is available only for iPad.
You can find QuestionIt in the app store here.  Check out the website here. http://languagelearningapps.com
And, if you prefer the good old fashioned paper activities, you can purchase the Wh-Program materials from me here. http://www.teacherspayteachers.com/Product/Wh-Question-Program-Bundle-809775 


Tuesday, August 27, 2013

Apps for Developing Personal Narratives


One of the most important skills in language intervention is the development of personal narrative skills.  These are the basis for all conversational interactions and social connectedness.  We talk about what we’ve done or what we’re going to do.  We talk about the fun things we’ve done or the bad days at work. (Or even the great days at work.)

I spend a lot of time working on developing these personal narratives with kids; particularly kids who can’t talk.  How can we get them to tell us about their experiences, their feelings, their desires (beyond requesting an item)?  There have been some great studies by Gloria Soto and her colleagues and students at SFSU.

One of the first apps out for creating personal stories/books was Pictello.  Pictello allows you to import your photos, input text, offers text-to-speech or record-your-voice options, lets you decide how many pages, lets you add labels to your photos, and even has a step-by-step tutorial built into the app.
Book Creator 
also offers a tutorial when you open the app.  You can add and resize images, add your text, and create more pages.  When you’re done, you can import the book into iBooks to read it, or Dropbox to share it.
In my experience, Story Creator requires the user to be a bit intuitive.  There seems to be less flexibility in this app.  Each book is like a photo album to which you can add text and sound.  There are crayons with which you can draw your own pictures or embellish your photos.

The Story Dice app is just like the actual dice set for stories.  You choose how many dice to roll, and you are given simple icons to represent the story elements you need to use.  There are some instructions for how to play to construct stories and sentences singly or in groups.

Storybook Maker
appeals to a younger crowd, with stickers and a note to send the book to grandma and grandpa.  There are a variety of page layout options (about 12), and lots of options for backgrounds and borders, objects to add that have some animation.
You don’t need a specific story app to retell stories.  Check out Toontastic, Puppet Pals, and Felt Board to create scenes and pages.  There is a great post with video for using Puppet Pals to retell stories here. http://mindwingconcepts.com/Language-Development-Literacy-RTI-blog/put_the_story_on_stage_with_puppet_pals Sean Sweeney, who made that video, also made this one on using Tellagami for the same purpose.  See it here. http://www.speechtechie.com
There are many other book maker options for creating personal stories, some of them very inexpensive.  For SLPs this is great news. We can work on constructing narratives about what the child did, when he did it, where he did it, who he did it with, whether he liked it or not, what happened that was memorable, and more.  And then he can tell them again and again, gaining the very important experience of retelling stories (as Carol Westby reminds us) that nonverbal children - and those with language disorders - just don’t always get.  
Another way to use many of these apps for story re-telling is to use the camera in your iPad to take pictures of the characters, settings, and events in the child’s favorite books.  Import those pictures into the story making app and re-create the story in a simpler form consistent with the language level of the child (or just a bit higher).  Now he can retell the story to you, to his teacher, to ....everyone.  This story retelling is something that typical children do all the time.  They “read” books to the dolls and stuffed animals, and even their parents.  But our children with language disorders don’t get this kind of experience, which is vital to their development of conversational, social, and academic skills.
And now Mindwing Concepts, the company owned by the creator of the Story Grammar Marker, has an app that goes with her other materials for telling stories and personal narratives.  She has some great handouts available.  Get one here for free.
http://www.slideshare.net/mindwingconcepts/mindwing-autism


Monday, August 19, 2013

Apps in Intervention


     There are so, so many apps out there.  And we all want to use them.  But how to find the “good” ones?  There are so many to wade through, and you can’t possibly look at them all.  So, I am going to post a few suggestions here about apps I like to use.

     My absolute favorite - and this is true for many SLPs I’ve spoken with - are the Toca Boca apps.  They are just a lot of fun and can promote interactions in a genuine way.  Because genuine communication is what we’re after, right?  The first Toca Boca app I ever used was Toca Tea Party.  It brought back a flood of memories of all my daughter’s tea parties when she was little, and excursions to fancy hotels for “high tea.”  I love using this app when I do evaluations. (1 iPad for interaction, 1 for fun apps - what a lot of technology!)  We can talk about which plates or cups or treats we want.  We can comment about being hungry or thirsty or how good something tastes.  We can ask for more and say we’re finished.  Depending on your student/client, there are some fun conversations to be scaffolded.

     Of course, I see more boys than girls.  So, I have “had” to acquire Toca Builders, Toca Kitchen Monsters, Toca Robot Lab


(my favorite), and many more.  Again, just talking through the actions and sequences provides opportunity to input language and scaffold responses.  These are really fun and engaging apps that the kids (and some older clients) really like.  They stay focused and whatever they do within the app is an opportunity to generate a language response.  

     Another favorite is Bamba Burger.  Recently I spent almost half an hour with a young man with autism building burgers and cutting french fries.  I had lots of opportunity to make comments about his choices for toppings (fish? octopus?), and I got him to talk about what he wanted and whether he liked it or not.  He also liked their Ice Cream Parlor app.

     Other apps for building language: Educreations lets you make lessons your students can watch over and over again.  Quizard lets you make your own flashcards.  Talk About It: Objects asks students to “tell me everything you can about a cat.”  See Touch Learn also allows you to create custom flashcard sets.  

     But, to be honest, I’m not into flashcards anymore.  Even Lovaas acknowledged that you can’t teach language in discrete trial drills.  Work in context.  Create those contexts.  That is what makes the Toca Boca apps work so well for working on language.  Many of the contexts are familiar to kids, and the others are play routines that give our kids room to grow.  One thing I am constantly telling SLPs and teachers is this: “Therapy with kids with AAC systems or with other technology is not any different from the therapy you have always been doing.  It’s just a different mode of expression, or different ‘virtual’ toys to play with.”  Have fun!

Coming up next: apps to develop personal narratives, apps for guided and shared reading, and, of course, my own 2 apps (QuestionIt and SoundSwaps).

Friday, August 16, 2013

AAC Assessment - When we say, "Help! I don't have enough equipment."


One problem that school-based SLPs have when it comes to aac assessment is a lack of equipment.  A good aac assessment tries a variety of options with a student, with different symbol sets, symbol sizes, array complexities, and other software features.  Organization of vocabulary is one of the biggest features to consider, overriding almost everything else.  There are few assessment centers that offer a full range of assistive equipment to try.  But there are also few districts that can provide SLPs with such a range.  So, how do we make good choices based on effective assessment?

Recently we have been hit with an iDevice explosion.  Many districts are even bypassing assessments and offering kids and their parents an iPad with an aac app as soon as the words “AAC evaluation” pass their (or their advocate’s) lips.  Many SLPs fear this is subverting the process and may not provide students with the system that really meets their needs.

The iPad revolution has, however, provided us with a tool that we can use effectively - and relatively cheaply - to help us in the assessment process.  There are many, many aac apps available in the iTunes store.  However, way too many of them represent nothing more than one more choice board, in my opinion.  There are few robust aac apps that offer well-organized vocabulary sets, a full range of symbols to use, flexibility in array size and complexity, and layers of navigation that are systematically well organized.  Among them, I believe, are Proloquo2Go (particularly in its version 2 and beyond updates), Sono Flex, LAMP Words for Life, and TouchChat (various versions).  Each has its pros and cons, and I won’t make a plug for any of them here.  Sono Flex is the least expensive of them, at about $100.  For a little less, I have purchased GoTalk Now (about $80) and taken advantage of its many features to create a large range of assessment activities.  
I use these in conjunction with activities from other apps and, of course, paper-based systems that range from a small choice array through the various display options of the PODD and Pixon books and boards (more on these later).  I can create pages with 1, 4, 9, 16, or 25 buttons per page, link them in any navigation pattern I wish, use symbols from the app’s library (which includes some realistic photo-like symbols as well as standard symbol libraries), combine more than one symbol on a button, resize the images and text, import photos, videos, and audio tracks, and more.

I have created a master home page that links me to different options of linked page sets.  Now, within an assessment session, I can move almost seamlessly from a 4 core word display to 25.  I can change button and background colors and add audio information for kids with cortical vision impairments.  I can choose between recorded voice and synthesized speech - in multiple languages.  I have made multiple activity-based pages with core words in stable locations for activities that are preferred by many of the kids I see, in sizes from 4 to 25 buttons per page, and can see how far I can “push the envelope” within the scope of a session.

I hope that’s given you something to think about - and a project to keep you busy.  


For a more complete explanation with screen shots and app descriptions, I have my presentation Lightening the AAC Tool Kit Load  (originally presented at Closing the Gap 2012) in my TPT store here.


Thursday, August 1, 2013

Vocabulary Organization


Hi.  And welcome back as I continue to talk about AAC this month.  This is a huge topic, of course, and I’ll be coming back to it periodically throughout the year.  This month I’m just trying to talk about some of the basics.  So, today’s topic is vocabulary organization.
This is a topic that has been debated throughout the relatively short history of aac.  It may be one of the most hotly debated topics in ASHA, and has been the source of multiple guidelines and proposals.  I do think, however, that we may be closer to some consensus.

So, here’s the basic debate.  Proponents of the use of core vocabulary in aac posit that much time is spent - wasted - providing unending lists of content-based vocabulary (Baker, Hill, & Devylder, 2000).  They prefer to use core words with broad meanings; particularly words with multiple meanings and high frequency of use.  And they focus on the use of words only - no static phrases - to enable the user to combine words to construct genuine messages.  (Static phrases do pop up in social chat pages and other high-frequency places to enhance the speed of communicating in some contexts.)  Core words are re-usable words.  Use of core words minimizes the “real estate” needed on a display page, and gives the student access to the most often used words to create their messages.

On the other hand, others ask, “Why limit vocabulary access?”  Does access only to core vocabulary limit students’ messages?  Should we focus instead on pragmatic intent and how to use language instead of word selection?  Shouldn’t we teach children to use a variety of functions and show them how to find the vocabulary they need for each of those, while providing them with a rich and varied vocabulary to use in all contexts?
Gayle Porter’s Pragmatic Organized Dynamic Display (PODD) books use pragmatic branch starters to define message intent first and foremost, then teach how to find the words needed for the message based on that intent.  These books provide a rich and varied vocabulary to meet all needs, along with a built-in system of navigation conventions for dynamic display.

And then there are those who continue to use topic- or category-based communication boards/pages, moving students to core vocabulary as their fringe vocabulary grows.  As speech-language pathologists, many of us think of words in terms of categories and functions.  Many communication systems have been built, historically, around basic categorization - words for things to eat, for things to play with, etc.  Unfortunately, few - if any - of these systems have provided a way to vary function and increase narrative or syntax.  These systems provide groups of words that are predominantly nouns.  There may be a page of verbs, or a couple of specific verbs for each noun group.  Some adjectives are usually provided - particularly colors, shapes, and feelings.  There is limited ability to construct genuine messages, build syntax skills, or move beyond requesting and some limited responding.

Just a brief note here about core vocabulary for the uninitiated.  Core vocabulary is that smallish set of words we use for most of what we say.  The average toddler uses only 25 words for more than 95% of what they say (Banajee et al., 2003).  The average adult uses only 300-500 words for about 80% of what they say (outside of professional vocabulary).  Core words are identified through word frequency, reflecting the most commonly used words in a language.  They are composed of pronouns, prepositions, determiners, conjunctions, and verbs - but, for the most part, not nouns.  They are words that provide substance to a message and, in aac, are words that help provide information when a specific word is not available.

So, back to the debate.  You may be a proponent of one school or the other consistently throughout your practice.  Or, you may make a thoughtful decision each time you put together an aac system for a student, considering the individual needs and circumstances.  But what you always need to do is to organize the words the student is going to have and teach her how those words are related.  We may teach “go” and “ride” in the contexts of their use, and how to use them in multiple contexts for different meanings, but we still need to teach that these words are about doing something, and that the doing-something words are in (X) location in the system.

On a core word display the word “go” is among the first words taught, and available on the front page.  “Go” is taught in many contexts: get in the wheelchair and go someplace, go on the potty, get in the car and go somewhere, make it go by turning it on, go start an activity, go away and leave me alone.  In all cases, the focus is on that one word on the first page in that one location.

In a pragmatic display, the focus is on the type of message.  Yes, “go” is right there on the pragmatic branches page.  But it leads to the places page to tell where you want to go, or where you went, or where you are going to go, or where you are going.  Using this “go” button takes us to options for where that somewhere is.  To ‘go start an activity’ you would start with the ‘I want to do some activity’ message button, and then from there explain what you want to ‘go’ do.  You’d find “go away” on the first page of “quick words” because it is used a lot and needs quick access.

In this system, to say “Let’s go” you request to go.  To ask “Are we going?” you’d start with a question intent.  To tell about going, you’d go to the telling-about or telling-a-story button(s).  To ask to go play, you’d start with the activity button.  The emphasis here is on the intent of the message.  We teach students how to indicate the type of message they want to deliver, as well as a consistent navigation system to control the content of the message.
Below are the 30-location Pixon board from Gail Van Tatenhove and the 20-location PODD one-page opening pragmatic branches page from Gayle Porter.



Monday, July 22, 2013

Static Display in AAC


AAC refers to communication approaches that augment (add to) or serve as an alternative to speech.  It includes all methods that make communication easier or possible.  There are a great many options that fall under the heading of Augmentative and Alternative Communication.  A functional aac system is a compilation of strategies that allow the student to communicate effectively across an array of intents and contexts, with all communication partners.  
At the no tech end of the scale are paper based systems that can range from simple picture displays and rotating choice options to complex Pragmatic Organized Dynamic Display books with more than 100 pages and Pixon boards of more than 100 icons.  At the high tech end of the range are a variety of dynamic display computerized systems; including dedicated devices, computer software, and tablet apps.  
In the middle are static display voice output devices.  These can have from 1 to 128 buttons, though typically no more than 16 or 32.  These devices use recorded human speech and paper overlays that require someone with sufficient dexterity to change the overlay and, sometimes, to set the device ‘level’ button to match that overlay.  Problem 1: Access.  The student usually cannot change the overlay herself.  Problem 2: Breadth of vocabulary.  Those devices that offer multiple ‘levels’ to be pre-recorded, to make the change of topic quicker, are usually restricted to 7 or 8 or 12 levels.  Problem 3: Message construction.  Each button can hold a word, phrase, or sentence (or more), but typically there is no ability to sequence buttons and retain them someplace from which they can be repeated as a fluid whole message.  Problem 4: Literacy.  If one can’t sequence buttons into a whole, one can’t create words.  There are no text-to-speech options here.
Static display devices are frequently used for choice-making or for responding in a specific context.  There is not sufficient vocabulary to meet all of a student’s needs.  There is not room for off-topic messages, clarifications, or comments.  There is little or no ability for genuine message construction.  Few of them provide scanning access for those who can’t direct select.  In most cases, the vocabulary location is not stable across overlays, making learning more difficult and obviating learning through motor planning.  In classrooms, these displays are most frequently used as activity-based displays.  The vocabulary is provided as needed in order of the activity and messages are chosen based on the steps of the activity. 
In the best possible scenario with static display, one can create overlays that use core vocabulary that is stable on every overlay and fringe vocabulary for specific topics, with enough to cover a number of contexts - but likely not all.  Most kids can’t change the overlays themselves, so they need a way to signal a change of topic to their partner.  Most kids are going to be limited in vocabulary learning and use with these limited displays and topics.  Most kids are not going to learn how to construct novel messages with a limited array of word choices.  Most are not going to learn literacy skills without the ability to sequence sounds.  So, as a primary component of an aac system, these devices fall a little flat for me.

This is not to say that static display devices don’t have their place.  They can be a valuable piece of an aac system.  And remember, aac is a system with multiple parts.  We all use more than just one way to communicate.  In fact, there are some amazing statistics about just how small a percentage of our communication is verbal.  But, back to the topic at hand - static display devices can be well used in a variety of communication contexts.  Given sufficient ‘buttons’ per page, it is possible to provide some basic core vocabulary for a small array of functions, as well as the fringe vocabulary basics needed for the activity or topic.  But for students to participate in their school environments, communicate effectively with their partners, and learn to use language effectively, I do not believe static display devices are sufficient to meet communication needs.  Click here to download a tech speak overlay template that provides some core vocabulary, as well as spaces to insert your own activity-related words.  
Please note that all symbols used in these displays are from Mayer-Johnson Co., a division of Dynavox.  The Picture Communication Symbols ©1981–2010 by Mayer-Johnson LLC. All Rights Reserved Worldwide. Used with permission.
Boardmaker™ is a trademark of Mayer-Johnson LLC.