Learning Goal #4:

Demonstrates Critical, Reflective and Metacognitive Thinking

-- Reflects on their own design processes by discussing how they will use “lessons learned” in their future design endeavors.

-- Analyzes how the processes used in creating various artifacts have contributed to their own development as an ID&LT professional.

-- Connects design decisions and other professional practices within the ID&LT program to their own emerging philosophy surrounding issues in the field.

Lessons Learned in the Design Process

When I design a web page or the materials for a class, I look at the information I need to convey from the user’s point of view. In other words, how would I want to learn this information? A theme in both IT 500 and IT 580 has been including the user in the design experience. To me, designing educational materials without considering who the user is or how they intend to use the materials is like planting a garden without knowing how tall the plants are going to be or how much sun and water they will need.

IT 500 was definitely a departure from how I traditionally think about design. When I design, I usually identify my target audience, identify their needs, and create the instruction to fit those needs. In IT 500, I covered many learning theories and strategies that I had never considered, such as Constructivism, and many that I had used without really knowing what they were called, such as Behaviorism. Because most of my training has been with adults (age 18 and above), I knew that adult learners had different needs and expectations than children. I also knew that these adult learners were coming to me as a trainer because they had a need to learn something specific. I just did not know it was called situated cognition. What IT 500 taught me was to think outside the box. I may have my learners for only an hour-long training session, but I can still use authentic tasks to enable learning. I can still use a problem-based learning opportunity to create more than just button-pushing training. Although I am not training now, I can use these theories and strategies in future training and in the web-based and handout-based tutorials that I still create.

Select a Theory or Strategy

An important lesson that I learned from IT 500 was that, while there are a lot of theories and strategies out there, I do not have to stick to just one. While training is often considered Behavioristic, I can still use Constructivist activities to make the learning more meaningful. The learners still need to know what buttons to push and what steps to take to use a piece of software, but I can also work in problem-solving activities that will develop their troubleshooting skills and help them in future situations.

In IT 580, the class performed an experiment in which a “teacher” would create a design using six basic household items and then instruct a “student” in how to create the same design. In the first group, only the teacher could talk. In the second group, the teacher could talk and the student could ask questions. In the third group, the teacher and student could talk, and the teacher could also watch what the student was doing and adjust the instructions. The third group made the fewest mistakes because the student could ask questions and the teacher could watch the student, anticipate mistakes, and make corrections quickly. (See the experiment data and final paper.)

While the results were what I expected, what I learned during this experiment was the interaction design concepts of the Gulf of Execution and the Gulf of Evaluation (2007). The Gulf of Execution is the difference between what the designer assumes the person knows and the action the person actually takes. This is critical because we cannot always watch our learners while they use our instructional materials. I might assume that a user knows how to double-click a mouse button and when to single-click versus double-click. When a designer assumes things about the user that the user does not actually know, the user gets confused and mistakes are made, as happened with groups one and two of the experiment. In one case, the teacher assumed the student would place the scissors on the right side because the teacher was right-handed.

The Gulf of Evaluation is how well the designer’s explanation of the system matches the user’s understanding, or how well the user can figure out how to use the system from the design itself. When I design tutorials, I assume that users know where to click the mouse or where to find the submit button. While most users can follow written instructions, video tutorials in which the user has to actually click a button to advance help the user understand the lesson better. In IT 596, I created a tutorial using Captivate in the training format, where the user has to type or click to advance the tutorial. This method, rather than the standard “watch the video and try it yourself later” approach, forces users to actually make the moves in order to continue. It also eliminates some of the steps a user might overlook when only reading or watching the instructions.

I believe the reason the third group was so successful was that the student could ask for clarification when necessary and the teacher could observe; the Gulf of Evaluation and the Gulf of Execution were minimal. This was an important lesson for me. I have always taken the audience into account when designing tutorials or training sessions, but I cannot assume that what I believe the user knows matches what they actually know. Face-to-face training works best for addressing the special needs of learners, but face-to-face sessions are not always practical. Having specific instructions and images in the print tutorials, and using interactive video with some verbal explanation in the video tutorials, will help close the Gulf of Execution. Keeping the design and instructions simple should help close the Gulf of Evaluation so that users can concentrate on the learning and not on the mechanics of using the materials.

Improve the Design

Another lesson I learned was that a design can always be improved. Improving a design is often overlooked by designers because of time limitations or workload, yet evaluating and improving the design is a common thread in many design theories. In his design research approach, Edelson (2002) identified a student need, designed software to meet that need, evaluated how his students used the software, and redesigned the software to better meet their needs. His theory stresses that design is fluid and should be constantly analyzed and redesigned to meet learners’ needs. Design should not become stagnant, because as learners change, so do their needs.

In IT 510, we worked on our design projects and had multiple opportunities for our classmates to review our work. For example, during one of the checkpoints, Dr. Knowlton chose my project to emphasize several key points that many in the class had also missed: I was not being specific enough in my task analysis. I will admit that because I had some knowledge of the subject, I left out some important steps in the tasks. It took another set of eyes to realize that I was missing something. If another designer were to pick up my design, those missing items would confuse them and perhaps stall the project.

Jannette Collins’ article “Education Techniques for Lifelong Learning: Principles of Adult Learning” (Collins, 2004) was very insightful to me because the learners I work with, both students and faculty, are all adult learners. Adult learners have different needs and bring prior learning and experiences to the classroom that younger learners do not. Collins (2004) refers to the KWL strategy: finding out what the learner KNOWS, what the learner WANTS to know, and, at the end of the session, what the learner LEARNED. Surveying the students ahead of time to find out what they know and what they want to learn would help set the agenda for the class; conversely, teaching something that the learner already understands wastes time and can frustrate the learner. A survey before class begins, or before learners attend the class, would help me prepare the materials and subjects to cover. This is important to me because I have always assumed that I know what the students need to learn, yet each training session is different because the needs of the learners are different. So, like Edelson’s software design, Collins incorporates the learners into the design process in order to improve the design for the learners’ sake.

Creating Artifacts

Creating artifacts, or multimedia, is what I do as part of my job; it is my specialty, or niche, in the department. When I started this program, I selected the Interactive Multimedia Technologies program because I wanted to learn more about designing multimedia for courses and how multimedia fits into a course design. I have not been disappointed. While several of the classes have been theory-based, I have had the opportunity to experiment with and create several multimedia artifacts for use in my job.

The most notable artifact I created was a training tutorial built in Captivate for IT 596. I had used Captivate before to create video demonstration tutorials in which the viewer watches a screen capture of the steps involved. The area I wanted to explore was the training feature, where the viewer has to click buttons on the screen to advance the tutorial. From IT 500, I learned that making a task authentic leads to learning, especially for adults. Actually performing the task within the tutorial mimics the hands-on training that the department offers, but as a just-in-time option in the form of a pre-recorded video.

I was able to take the training video one step further in IT 597, where I created a series of tutorials for the image software Gallery. While in IT 596 I created demonstration tutorials, in this Studio 2 course I wanted to take them a step further and include an assessment feature. At the end of each tutorial I included a “Quick Quiz” featuring a question related to a fact mentioned in the tutorial. The quizzing feature in Captivate was new to me, and I struggled with the options for correct-answer and incorrect-answer feedback. These artifacts will add additional learning tools to the Gallery webpage to complement the print tutorial instructions.

Another artifact I created was a web page in IT 486 for the purpose of determining what types of audio and/or video recordings could be made for online lectures. My page was more informational than instructional, but in the process of taking the course I learned the difference and adjusted part of the website to be more step-by-step instruction than information. Having participated in the ID&LT program, I will probably redesign this page before presenting it for approval to my department. While the design is user-centric and has a lot of useful information, more attention should be given to the step-by-step process and to evaluating the usefulness of the item being created. Just creating the artifact is not enough. As noted earlier, the artifact should be evaluated by peers and by users to determine not only the accuracy of the content but also the usefulness of the information included in the website.

My Emerging Philosophy

In my Jury 1, I wrote that the ID&LT program had not changed my philosophy but rather reinforced it and confirmed that the decisions I make are the correct ones. While much of what I have learned does reinforce my philosophy, I think I have fine-tuned my understanding of that philosophy. As my job duties have increased, I find that I must amend it as well. When I began this program, I was designing training materials and tutorials; my duties now also include instructional design for online courses and course management.

The design theories and artifacts that I have learned and created so far will be helpful in my job as a trainer and instructional designer here at SIUE. I believe that we should design for the user’s benefit, not for what is easiest for us as designers. Usability has been a theme in many of the readings I have done as part of my coursework. Whether designing simplified instructions for the technology novice or working with first-time online instructors, the designs must be user-centric and grounded in learning theories.

Much of what I have read and studied has emphasized the importance of the instructor-student relationship. I have always understood that the learner is critical to the success of any design. You cannot design effectively without knowing who your learners are and why they are taking your course, training, or lesson. Their motivation (learning new skills or taking a required course), their background (traditional or online learner), their experience (techno-savvy or first-time user), and their purpose (required or elective course) will all play a part in the learner’s attitude. In my work, the learners are also faculty. I not only have to understand the faculty’s motivation, background, experience, and purpose for taking my training or ID advice, I must also convey to them the importance of these same characteristics of their students when they design their courses. Michael Moore advises instructional designers and faculty that “It is very dangerous to proceed on generalizations because assumptions are then made that may be quite erroneous” (Moore & Kearsley, 2011, p. 171). I can’t assume that everyone understands and is comfortable with the technology as I am. Not everyone can spend the time on design or the course that I do. Not everyone is as passionate about instructional design as I am!

I also believe that while the media I use can affect learning, it is how I use the media that really influences learning. In John Cradler’s article “How Does Technology Influence Student Learning?” (2002), one quote summarizes my beliefs: “students may manipulate simulation and presentation software to create a visual artifact without really understanding or applying sound conceptual thinking. The role of teachers is paramount in guiding the development of student’s higher-order thinking skills during learning activities involving technology tools” (Cradler et al., p. 48). Because my duties include instructional design and course management, I help faculty pick the right tool and the appropriate technique to help students reach those higher-order thinking skills. For example, tests are one way of assessing knowledge. Tests can be used to assess factual knowledge, or they can ask students to apply what they have read in the text or heard in the lecture at a higher order of thinking, e.g., applying a concept to a case study. Another tool that supports higher-order thinking skills is the Journal or Blog tool. Having students reflect on concepts can lead them to draw conclusions about what they have learned. Students can review their journal or blog entries throughout the semester and reflect on the course as a whole, rather than on individual chapters or the concept of the week.

Gary Morrison (1994) states in his analysis of Kozma and Clark’s arguments that “it seems more productive to consider the effectiveness of the whole unit of instruction rather than the individual components” (p. 201). With training and tutorials, understanding why you take the steps is just as important as knowing how to perform them if you want to reach a greater level of understanding of what you are learning. In the example above, being able to reflect on a semester’s worth of journal or blog entries allows students to “get the bigger picture” more effectively than expecting them to read through 16 weeks of Word documents for a summary of the course. Rarely are students asked to reflect on the concepts of the entire course, probably because sorting through multiple Word documents would be tedious and unproductive.

In the past, because of time constraints and workload, I have not regularly evaluated my tutorials unless someone noticed a misspelling or error. Our department is just beginning to evaluate our training sessions. I have never asked users what they learned from the sessions or whether the tutorials are helpful. I do occasionally get comments that the tutorials are great or useless, but due to time constraints and workload, I don’t often follow up on those comments. Having read about the importance of usability and improvement, evaluating and redesigning our training and tutorials is a goal I will have to set for myself. Another goal would be to develop a user analysis to find out who my users are (other than faculty) and how they use the tutorials. Do they print them out or follow the instructions from the computer screen? Do they need additional information, such as course management and student user abilities, or is there too much information and do they only want the step-by-step instructions?

Because we are in the process of developing our instructional design team’s goals and strategic plan, I am in a great position to assist in implementing many of the theories and practices I have learned in this Instructional Technologies program. One goal I would like to include is a follow-up session with the faculty we advise. Right now, we meet with faculty for about an hour to discuss tools and options for the online class. Another goal I would like to develop is an evaluation process in which we meet with the faculty again, after they have had time to work on their syllabus and course shell, to offer continued support and advice. The evaluation process will continue during the first time the course is offered and again after the course is completed. Mid-semester course and teacher evaluations will be critical for the instructor to make adjustments to the course while it is in progress. End-of-semester evaluations will be critical for determining whether all the course objectives have been successfully met. The instructional design team should be included in this process to assist the faculty with issues and support until they are comfortable teaching online.

References:

Gulf of evaluation and gulf of execution. (2007). Retrieved October 12, 2012, from http://www.interaction-design.org/encyclopedia/gulf_of_evaluation_and_gulf_of_execution.html

Collins, J. (2004). Education techniques for lifelong learning: Principles of adult learning. RadioGraphics, 24(5), 1483-1489.

Cradler, J., McNabb, M., Freeman, M., & Burchett, R. (2002). How does technology influence student learning? Learning & Leading with Technology, 28(8), 46-56.

Edelson, D. C. (2002). Design research: What we learn when we engage in design. The Journal of the Learning Sciences, 11(1), 105-121.

Million, L. (2011). Assignment in IT 596. http://www.siue.edu/its/bb/multimedia/Tii_create_assignment2.swf

Million, L. (2012). Assignment in IT 597. http://www.siue.edu/its/bb/multimedia/gallery_upload1image.swf

Moore, M., & Kearsley, G. (2011). Distance education: A systems view of online learning (3rd ed.). Wadsworth Publishing.

Morrison, G. An analysis of Kozma and Clark’s arguments. Chapter 10, pp. 199-204 (text unknown). Originally published as: Morrison, G. R. (1994). The media effects question: “Unresolvable” or asking the right question? Educational Technology Research and Development, 42(2), 41-44.

 
