Wednesday, November 16, 2016

DevLearn 2016

Here I am again - DevLearn 2016. It's been great so far. 

Day 1: A day-long workshop focusing on interactive video. What does this mean? Interactive video means layering interaction onto a video to make it more engaging than it already is. It might mean putting in a hotspot, adding text to emphasize a point, or pausing to allow for a question.

The workshop covered a tool, HiHaHo, which sort of fast-tracks your interactive elements ("enriches" in HiHaHo language) by providing a web interface for you to add the components of interaction.

Pros: Quick and dirty results for interaction, easy to use, ideal for microlearning and informal learning.

Cons: Not suitable for polished, high-end results, and your video must already be edited to a high standard in order to use the product. It's more like you are "dressing up" your existing video (which should already be awesome). The delivery method is unclear, and viewing requires a subscription.
Bottom line: A good use of time, although I would not use this platform to create production content. I think it's a good storyboarding tool. 

Day 2: 

AM
Attended half of the keynote, which was given by Penn of Penn & Teller. He talked about storytelling. Penn thinks PowerPoint is sort of lame and prevents human sharing. Ok. 

Next, headed over to the Expo to try to hear about Articulate 360. Too noisy in the room. Could not hear. Headed over to get a cup of tea and ended up talking to an Articulate trainer who did our training last year.

After the trip to the Expo, returned to the main hall, where I hit a Camtasia session entitled "Eight Things I Hate About Your Screencast," which was based (I assume) on this article. Since I'll be doing video soon, I wanted to see what Mark Lassoff had to say about Camtasia. That was a good use of time. He's also a good speaker.
Takeaway not in the article: upgrade to Camtasia 9.0; it's worth it.

PM
After Camtasia, I headed over to get some lunch. I landed myself on a chair in a random session about LRSs. Hmmm. Ok. Learning Record Stores. I don't know. I was involved with my sandwich and couldn't get a lock on this one, but maybe returning to the description and the slides or content (if made available) will jog my memory.

I hopped over to a talk about PowerPoint. Wow. This was like a tour de force of everything PowerPoint can do. It was called "Eighteen Awesome PowerPoint Tricks for Effective Presentations". This guy, Richard Goring, is a great presenter, and he's like some kind of PowerPoint savant. We got a nice little booklet with some of the tips, and also a good website to search for videos with more tips! Yay!

Day 3: 

Made it over to a couple of sessions today before catching the plane. First, I went to another loud, inaudible talk at the Expo. It was about how to make your eLearning more efficient. Not much gained, unfortunately, since it was so difficult to hear.

The next one was extremely useful - a discussion about how to make your videos more interactive. The focus was really on having good videos with some kind of easy-to-understand structure (e.g. branching, messaging). Engagement here was really about making a good video first, then adding simple elements like hotspots into it. The talk focused largely on marketing-education videos, but still, some great ideas. Link here for examples we saw in the talk.

Thanks DevLearn! Love this conference!

Monday, November 7, 2016

eLearning Challenge #140 + eLearning Diwali Trivia Game

Well, finally got another one in. This one was a cooking analogy. I'm like 10 behind on these, but something is better than nothing.

eLearning Challenge #140: An ADDIQuTE Kitchen



I also did something for the work Diwali party, which worked out well when used in a large group - teams of four. eLearning Diwali Trivia Game: The Diaspora Game


Monday, October 10, 2016

ADDIQuTE - is it good enough to fly?

Before I started doing instructional design work, I taught a lot. Specifically, I taught human geography, physical geography and, occasionally, GIS topics. The reason someone like me can shift into "just instructional design" is that I'm already used to organizing information.

But instructional design isn't quite the same as teaching at all. Correction - it has nothing to do with teaching. It has a lot to do with training and being very specific about what you want the learner to know. That's the opposite of what some professors (yes, I was, and occasionally still am, called "professor" when I teach online) do. Professors often go on about something. They get off track, lose focus and sometimes miss their own point (not the good ones). They also often have job security.

I digress. See? I still have it. The professor thing.

But back to instructional design. 

The more engrossed I get in this job, the more I like it, in some ways. The only thing I hate about it is the tight deadlines. I think they sometimes produce low quality and oversimplify things that often require deep thought. So I ask myself, where can I catch that problem? Do I catch it in Analysis? Sometimes. Do I have a good SME? That's when you can make sure you include the RIGHT information in the BEST way possible. If you do not have good SME support, you might have to figure out the problem during Design and Development.

After Development, many instructional folks let go and let the trainers handle the Implementation. Evaluation is (largely) ignored or swept under the rug. Acting on the results of an evaluation is a kind of going backward that management may find a waste of time. It takes work. It's old news. Moving on! Our backlog is too high, so let's "table" that until later... and it goes to the attic.

So I started thinking: where or when (or both) can I make sure that my content gets a QUALITY check and gets TRIED out? Well, you could put it after things are put into place (this is the way Implementation is defined at my organization), and call it QuT - Quality checked and Tried. Get someone to just try out the material before it's delivered.

If your organization is more along the lines of "implementation" being the actual delivery, then the order may look like ADDQuTE, where QuT replaces Implementation. ADDIE works in different ways at different places.

My new addition to ADDIE makes your material ADDIQuTE (adequate). Sometimes that's the best you can do. 

Analysis
Design
Development
Implementation
Quality check
Try
Evaluation
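
Just to make the ordering concrete, here's a toy sketch in Python. This is purely illustrative - the stage names and the two variants are just the ones described above, not a real tool anyone uses:

# Toy sketch of the QuT idea described in this post. Purely illustrative.
ADDIE = ["Analysis", "Design", "Development", "Implementation", "Evaluation"]

def with_qut(implementation_means_delivery):
    """Insert the QuT step (Quality check, Try) into ADDIE.

    If Implementation means "putting things into place", QuT goes after it
    (ADDIQuTE). If Implementation means the actual delivery, QuT replaces
    it (ADDQuTE).
    """
    stages = list(ADDIE)
    i = stages.index("Implementation")
    if implementation_means_delivery:
        stages[i:i + 1] = ["Quality check", "Try"]      # ADDQuTE
    else:
        stages[i + 1:i + 1] = ["Quality check", "Try"]  # ADDIQuTE
    return stages

print(with_qut(False))  # ADDIQuTE: A, D, D, I, Qu, T, E
print(with_qut(True))   # ADDQuTE: A, D, D, Qu, T, E

Either way, the point is the same: the material gets a quality check and a try-out before Evaluation.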

Wednesday, August 24, 2016

eLearning Challenges: #139

After my relaxing work-cation trip to Seattle for the Articulate Roadshow (Bellevue, WA), I felt all refreshed to work on eLearning Heroes Challenges. That was three weeks ago. Finally, I got around to doing one of the challenges, which was really fun. Those challenges are a great way to keep your hands on the software and come up with new ideas and resources. I think it took me about 2 hours in total, including the design idea and finding appropriate resources. I really want to keep at it - perhaps after the next deadline I can take a few hours each week to do one.

The challenge I chose this time was to Give These Top Templates a Makeover. Naturally, I did not have time to do all the templates, but I chose one to play around with - and attempt a conversion to flat design.

While that challenge sounds pretty simple, there are some key points to consider. For me, they were:
  • Finding the right color palette (free)
  • Finding some flat design avatars (free)
  • Finding a sound clip (free)
  • Figuring out the overall look and feel (brain time)
These weren't hard to find (all references are in the final Storyline file), but incorporating and editing them did take some time.

BEFORE


MY MAKEOVER
The final result is available at this link.

Thursday, May 12, 2016

Demos (Screencasts)

As my team has been doing more eLearning lately, we've been trying to get clearer about standards to follow. I held a brown bag in hopes of getting folks to share information, which some did willingly. Sometimes it's like pulling teeth, but if you can get someone to talk about their work, that is ideal, because it is uninhibited.
Camtasia is the tool of choice, although some of these concepts apply to tools universally. 

Best practice tips

Visual 

Zoom

Consistent Zoom - recommend 115%
Maximum Zoom - 150% (do not exceed)

Aspect Ratio (Camtasia) - Capture Size

desktopwidth:i:1500
desktopheight:i:684
Frames Per Second
Lower fps means a smaller file size. The default frame rate is 30 fps (Camtasia).

Recommendation: 15 fps for screencast
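
The file-size logic behind that recommendation is simple arithmetic: frame count scales linearly with frame rate. A back-of-the-envelope sketch in Python (real codecs compress mostly-static screen content heavily, so actual savings will be smaller than this naive number suggests):

# Naive frame-budget comparison for a 10-minute screencast.
# Real encoded size depends on the codec and how static the screen is.
def frames(fps, minutes):
    return fps * minutes * 60

at_30 = frames(30, 10)  # 18000 frames
at_15 = frames(15, 10)  # 9000 frames
print(f"30 fps: {at_30} frames; 15 fps: {at_15} frames")
print(f"Naive reduction: {1 - at_15 / at_30:.0%}")  # 50%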

Audio

Tool: Audacity (free, and that link will take you to the download page)
Record in Mono
Align the recording software's preferences (sound input) to your recording device. For example, if your device supports 44.1 kHz, 16-bit, set the recording preferences to 44.1 kHz, 16-bit.
Save as WAV, signed 16-bit PCM, no encoding (to prevent distortion)
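
If you want to sanity-check that an exported file actually matches those settings, Python's standard wave module can read the header. A minimal sketch (the file name is a hypothetical placeholder):

import wave

# Verify the export is mono, 44.1 kHz, signed 16-bit PCM.
with wave.open("narration.wav", "rb") as w:  # hypothetical file name
    assert w.getnchannels() == 1, "expected mono"
    assert w.getframerate() == 44100, "expected 44.1 kHz"
    assert w.getsampwidth() == 2, "expected 16-bit (2-byte) samples"
    print("OK:", w.getnchannels(), "channel,", w.getframerate(), "Hz,",
          8 * w.getsampwidth(), "bit")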
Other tips:
  • Talk through the screencast as you move around the screen so you don't lose the flow.
  • Pay close attention to sequencing steps for the learner
  • Recording in small bursts is better for editing and reorganizing - particularly because short clips are easier to move around in Camtasia and Storyline
  • Be sure you have a quiet space (no fans, phones, other distractions)
  • If possible, set up a recording space in your home to do the narrations
  • Speak slowly and pause appropriately


Resources

Screencasts Presentation (2/9/2016 brown bag lunch) - the presentation has a lot of links to reliable blogs and tutorials
Lynda provides a Screencasting tutorial and includes overviews for Camtasia and other tools

Thursday, April 28, 2016

When your negative thought becomes positive...

Recently I had a really positive experience taking an online course. The irony was... well, there were two ironies. The first was that I thought it would be a slog, you know: group projects, absent instructors, etc. But this wasn't the case at all. In fact, this was a very positive experience (and continues to be - there are two weeks left).

The second irony (maybe more of a silliness than an irony) is, more obviously, that it was an online course about eLearning offered through Canvas by instructors at the University of Brussels. On top of that, the group project assignment had a group with four members scattered across the world, literally. So let's just map that out for a moment. 




Prezi is a cool tool.

I was pretty sure that the group project would be a disaster, but it was 100% complete in a week, on time. Everyone did their part, and we all learned new tools. I did not know how well Microsoft Office Mix would work, but it was a breeze. Now, the project itself was not perfect and it definitely would benefit from some serious editing, but it was amazing to me that everyone stuck with it and came up with something that everyone was comfortable sharing with the class (everyone in the class is a working professional).

I'm looking forward to doing a cool final project for the class too.

This brings me to the point of my thought. Taking an online course is an exceptionally good use of time for a learning designer. You pay attention to how things are presented and designed, but you are also trying to learn a new skill or concept, so you are really immersed both in your craft and in your learning motive. This experience has informed me and, surprisingly, inspired me.

Monday, January 25, 2016

Super Excited Because Agile Has Come to my Corner

UPDATE: 
With a move at the office, our whiteboard got "shut down" (there was no room for it). However, I found a great virtual whiteboard app I'm trying out called Realtime Board. It looks something like this (this is the demo):



PREVIOUS POST:
When I'm in a meeting and I hear words like BUY IN and APPROVE and SIGN OFF, I get really upset. Do you want to know why? Here's why:



The lower arrow is better

The BUY IN happens at the beginning in the first arrow. See how much failure is likely? 

The second arrow gives everyone a chance to get comfortable and "BUY IN" (if you MUST use that terminology) every time there's an iteration.

My new colleague and I are trying out Agile now because it just makes things far more transparent in terms of what we're doing and what we have to do. It is a better approach than having pointless meetings with too much management speak and not enough action.

Our board

Monday, January 11, 2016

Quality II: ADDIE application

From the last post:


User-based Quality, Product Quality and Manufacturing Quality all have characteristics that help me home in on a quality model for learning products.

Let's now impose ADDIE onto this trifecta of magical characteristics. 


ANALYSIS: In ADDIE, the Analysis is a needs assessment. In practice, this is when we ask ourselves the following questions:


Who is the learner audience?
What is the mode of delivery?
What subject coverage is desired?
What resources are available (SMEs, documents, etc.)?
What are the constraints (it's usually time)?
When does the project/module/learning piece need to be done?

Garvin says: The characteristics that connote quality must first be identified through market research (a user-based approach to quality).


I'd say that each of these components of Analysis contributes toward high quality. You should be able to identify each of them as specifically as possible. In my experience, the vaguer the answer to each question in this list, the lower the quality. I'd love to back that claim up with a real test case or multiple test cases. Actually, I can: off the top of my head, I can think of three distinct cases (not all my work, though one definitely was) where poorly defined audiences resulted in poorly defined courses that were not well received. On the other hand, I can think of one course I handled that was well defined in terms of audience and coverage. That course was well received.


DESIGN AND DEVELOPMENT: Looking at these two phases in ADDIE, we are basically talking about good design and good work.


Design is the rigorous process of identifying learning objectives for the material and for the audience, and then creating a storyboard or prototype to see how things fit together. Suffice it to say, if you didn't clearly define your audience, the learning objectives are obviously going to be more vague. If you did not limit your subject matter coverage, you may be dumping rather than designing. Avoid the dump.


Good design checks:


ALIGN YOUR DESIGN: 

Define learning objectives and verify objectives with another party (SME ideally) 
Create a storyboard or outline
Does the storyboard or outline include graphical representation? 
Does the storyboard or outline refer continually to the learning objectives, and can you identify when and where? 

Development is where actual work gets done. There is a lot of room for quality checks during development. This falls into the iterative SAM/ADDIE/Agile feedback cycle. So feedback is definitely an important quality check that may benefit the end learning product.


ITERATIVE DEVELOPMENT:

How many feedback cycles has the work been through? With whom? 
Has the work been revised based on feedback? Why or why not? 

In summary, the product-based approach to quality rests on the idea that "high-quality characteristics" must be translated into identifiable product attributes. Perhaps the presence or absence of an attribute is good enough. Or maybe a combination of presence/absence plus some "standard" or level of the attribute would help to measure quality.

IMPLEMENTATION/EVALUATION: 
Implementation is what happens when the course is done and it gets delivered. Ideally you'd do a test run rather than dump the entire developed course into the lap of a trainer (or a student) and just hope for the best. However, this is how a lot of information is shared. Heck, I've done it myself. You dump everyone in the pool and see who can swim. 

This is not fair. Well-designed material is a representation of not only knowledge, but of your corporate brand/entity. Poorly designed material is, quite frankly, a downer.


How can we make the implementation and evaluation steps critical to quality? 


IMPLEMENTATION

Run a pilot (expensive) 
Test a component of the learning and evaluate it 
Train the trainer (train or communicate details)
Repetitive tasks such as uploading files have a quality check or are automated

EVALUATION

Are students able to perform the tasks set forth in the learning objectives? 
Are trainers able to convey the learning objectives? 
Did the hands-on work focus on the learning objectives? 
Gather actionable feedback - evaluations must be revised continually

The manufacturing-based approach to quality was and continues to be rooted in how well a product conforms to standards. An approach that ignores the "specification" will not result in a quality product. In the case of implementation, we'd have to see how well the material stood up to its intended purpose. The best vehicles for this test are 1) the learning objectives and 2) the hands-on material (labs). Another way to make sure implementation goes according to plan is to make sure the trainers (if it's not you) are prepared. Finally, a cheaper, faster way to "test" implementation might be not to run a whole pilot but instead to take a chunk of the material and try it out before a major release.


Poor Evaluation. The often ignored, youngest child of ADDIE. Evaluation can really bring everything back to the beginning by asking whether or not our specification - the learning objectives - was met, and why or why not. You can go back to iteration during design and development: did the design rely on feedback? How much? Is more needed? Was the design examined? Are we asking questions that actually tell us something about the content? Sometimes the comments are far more valuable than the smile sheets or ranking evaluations.


Thursday, January 7, 2016

Quality I: Old School Quality Concerns

A few days ago, I had a little idea about quality and instructional design, and I figured I'd really need multiple posts to deal with it. Here's the first of the multiple.

Garvin (1984) dug through the literature and summarized five ideas about quality, stemming from different fields including marketing, economics, operations management and even philosophy. His goal was (I think) to form a more holistic concept of quality as it pertains to products.

What he came up with was a list of definitions for quality. There are five of them, and he has all the proper citations in his paper; since this blog is really for my own mind wanderings, I'm going to go ahead and leave the citations out for now.

Transcendent Quality -  proponents of this view claim that quality cannot be defined precisely; rather, it is a simple, non-analyzable property that we learn to recognize only through experience. 

Product Quality - Product-based definitions are quite different; they view quality as a precise and measurable variable. Quality reflects the presence or absence of measurable product attributes (which are assigned a cost, when analyzed economically).

User-based Quality - individual consumers are assumed to have different wants or needs, and those goods that best satisfy their preferences are those that they regard as having the highest quality. Personalized quality - this is very true today (2016). 

Even perfectly objective characteristics, however, are open to varying interpretations. Durability, for example, was regarded as an important element of quality in Garvin's day (1984).

Manufacturing Quality - "conformance to requirements." Once a design or a specification has been established, any deviation implies a reduction in quality. Quality is defined in a manner that simplifies engineering and production control. Efforts to simplify and streamline (which are equivalent to reductions in the number of deviations) exist solely to reduce costs. This type of quality is perhaps applicable to software quality.

Value-based Quality - according to this view, a quality product is one that provides performance at an acceptable price or conformance at an acceptable cost.

Clearly, the "transcendent quality" definition is useless to me (shout out to the philosophers in the crowd!), so let me just ignore that. I am also going to ignore value-based quality, because it really says that "cheap things have quality value because they are cheap," and I just can't see how it fits into the learning product idea in terms of material quality.

Moving on, product quality makes a lot of sense: presence or absence of a measurable product attribute. Some of the examples include how much butterfat is in a good ice cream, or how many knots per inch make up a good rug. These are measurable attributes. The attributes themselves are arguably subjective, but Garvin assumes that most people would know the attribute as it pertains to quality.

Both the user-based quality definition and the manufacturing definition could work for me in terms of setting up my "learning product" quality idea. The user-based quality definition speaks very much to the current trend of product customization, which has a lot of value. The common person can have customized learning, just as he or she can have customized drinks at Starbucks. 

While manufacturing quality seems like a silly thing to bring up here, it actually isn't, because people talk about technology products in a manufacturing way all the time. I'm not saying I like it, but people do talk about standards and conformance and consistency a lot. There are a lot of boring, repetitive tasks in technology that ideally should be automated, but sometimes are not, or fall into a type of "mechanical" work that is similar to manufacturing - large-scale file changes and uploads, for example. Even automation is subject to a quality check. So it's not ridiculous to consider "conformance to requirements" in at least one of the ADDIE steps... ah yes. I'm trying to transpose some of these ideas into ADDIE... Mwa ha ha ha.
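
To make that last point concrete: even a "mechanical" task like uploading course files can carry its own conformance check. A minimal sketch in Python (the directory paths are hypothetical placeholders, not anyone's real workflow):

import hashlib
from pathlib import Path

def sha256(path):
    """Checksum a file so the source and uploaded copy can be compared."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

source_dir = Path("courses/outgoing")    # hypothetical
uploaded_dir = Path("courses/uploaded")  # hypothetical

for src in source_dir.glob("*.zip"):
    dst = uploaded_dir / src.name
    ok = dst.exists() and sha256(src) == sha256(dst)
    print(f"{src.name}: {'OK' if ok else 'MISMATCH'}")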

So, in summary: User-based Quality, Product Quality and Manufacturing Quality all have characteristics that help me home in on a quality model for learning products.

The next post will clarify further...


Wednesday, January 6, 2016

Information Mapping - Overview

A colleague of mine sent me a few interesting links, so I'm putting them here for reference and commenting later. When I have time.

Horn, R. E. Clarifying Two Controversies About Information Mapping's Method. Accessed 01/06/2016.

Horn, R. E. (1974). Information Mapping. Training in Business and Industry, 11(3).

Parsons School of Design Information Mapping Papers

Tuesday, January 5, 2016

ID and Quality: A Prelude

What is quality?

When we buy eggs at the market, we don't want a broken one. When we order food at a restaurant, we want it to taste good. A high quality experience in economy class is when the flight attendant offers you a glass of complimentary wine just "because you look like you need it" (I recently had this experience, and it resulted in my writing a glowing satisfaction report for the flight). 

What is poor quality? 

Every egg carton has broken eggs. The food is too salty in the restaurant. The flight attendant was not warm and friendly. All have different root causes, but each of these can make or break a quality situation.

How do we define quality when it comes to instructional material? How do you know if what you have produced is high quality before you deliver the training?

In order to answer these questions, I have decided to treat instructional material as a product. It is, in a sense, a product. It is painful to me, as both the daughter of academics and a trained academician, to admit this, but the plain fact of life in America is that everything can be productized. All our experiences are at some point commodified. This includes the very act of acquiring information. When you design instructional material, you are really designing a product. A learning product. 

Even though it annoys me, working on this assumption eases the search for answers because there is a mountain of literature and information about quality (but how good is it?). 

One seminal work that I'm going to start with is David Garvin's (1984) paper titled "What Does Product Quality Really Mean?". I am looking at this one closely because it is a review paper that summarizes ideas on quality (up to 1984, obviously). One glaring point here is that in 1984, software quality was in its infancy; software training was not even born yet. However, I'm going to argue that the ideas of quality, as a concept, are somewhat ageless, and that there is some hope that we can apply the ideas in Garvin's work to the present questions: What are the key attributes of high quality instructional material, and how can an instructional designer strive to produce high quality material?

My ultimate goal is to answer these two questions, and also to produce a checklist that helps instructional designers determine whether or not they have met a basic quality standard. This is probably going to take more than one post!

References

Garvin, D. A. (1984). What Does Product Quality Really Mean? Sloan Management Review, 26(1), 25-43.