Learning at Scale Slides from ICTCM

Mar 11, 2017

Learning at Scale: Using Research To Improve Learning Practices and Technology for Teaching Math

In the last 5 years, there has been a rise in what we might call “large-scale digital learning experiments.”  These take the form of centralized courses, vendor-created courseware, online homework systems, MOOCs, and free-range learning platforms. If we mine the research, successes, and failures coming out of these experiments, what can we discover about designing better digital learning experiences and technology for the learning of mathematics?

 


The Importance of Findability for Learners

Dec 16, 2016

How do you feel when you go to find information on a website and you just can’t find it? This happens to me all the time when I want to find out what some new ed-tech wonder product does: I visit the website and can’t find any screenshots, descriptions, or videos of the product in action. I find it incredibly frustrating, and the story generally ends with me giving up on even signing up for a trial. The same thing happens to students when they go to find information and it is buried in a nonsensical place.

As everyone finishes a semester and prepares documents and course shells for the next, it seems a good time to share this article, The Impact of Findability on Student Motivation, Self-Efficacy, and Perceptions of Online Course Quality. While the research targeted online courses, many face-to-face courses are now accompanied by a myriad of resources that live in an LMS course shell, and I think there are implications for findability in course packets and syllabi as well.

For this article, one of the researchers, Dr. David Robins, User Experience Design professor at Kent State University, has presented the study in a webinar format available on YouTube. Their research question: What happens when students have trouble finding components of a course?


 

The researchers took two courses that were well-designed and passed Quality Matters standards, and then “broke” them in terms of findability. The broken courses still technically passed QM standards, but the components were harder to find. Students were asked to perform scenario-based tasks in the online courses.

Sidenote: If you’ve never seen a standard software usability test, here’s a nice “findability fail reel” for a mobile website with questionable usability.

I don’t think anyone will be surprised to find that poor findability correlated with decreased self-efficacy and decreased motivation. However, there was an interesting set of actionable findings regarding navigation and visual design that came from researchers watching participants attempt to navigate the courses. Consider looking for these types of things in your course or syllabus and then improving them:

  • navigation items that are not grouped into logical categories
  • poor labeling (e.g. using the file name instead of a true description)
  • poor categorization (e.g. placing an exam review under “Course Documents” instead of in the section labeled “Prepare for the Exam”)
  • deeply buried content (e.g. syllabus is buried four levels deep)

This article also got me thinking about whether the most important items of a syllabus might be presented in a more 21st-century-friendly manner. There is a whole rabbit hole of syllabi created as infographics on the Interwebs.

Probably your university is still going to want an old-fashioned text version, but maybe students could use more visual infographics for what I would consider the top-5 syllabus items of interest to students:

  • How is this course graded?
  • What are tests like?
  • Are there any projects or papers?
  • Do I have to attend class?
  • Is there group work?

As well as the additional syllabus items that instructors want students to know:

  • What are you going to learn?
  • Why should you care about what you are going to learn in this class?
  • How strict is this instructor on deadlines?
  • What is considered good/bad behavior in this class?
  • What are the instructor’s pet peeves? (come on, that’s a real thing and whole chapters of your syllabi get devoted to these issues)

Challenge: Take a fresh look at your syllabus and/or course shell. Assume that you do have findability issues and look for them. If you don’t think you have them, hand the questions above to a friend or family member and see how long it takes them to find the key components. Revise and improve the findability of important components to lower student frustration for the next semester.

Note: A bite of learning design and a challenge goes out every week. If you’d like to have it delivered to your inbox, sign up at Weekly Teaching Challenge.

Reference:

Simunich, B., Robins, D. B., & Kelly, V. (2015). The Impact of Findability on Student Motivation, Self-Efficacy, and Perceptions of Online Course Quality. American Journal of Distance Education, 29(3), 174-185.


Possibly Related Posts:


read more

Why prototype a digital course?

Oct 11, 2016

Very few of us would buy an unbuilt home without at least viewing a model home that conveys the look and feel of the interior and exterior of the rest of the community. We should be unwilling to build (or buy) an entire course (a “row” of units, modules, chapters, or weeks of content) without seeing at least one “model unit” first.

[Image: Craftsman-style model home exterior, from http://www.houzz.com/photos/36213135]

In the software world, a low-fidelity prototype is used to give the look and feel of a future product. With this prototype, there is some hand-waving (mockups) to explain away missing functionality, and potential users are asked how they would navigate and use the product. This happens long before the product is built, and the process is iterative.

In the learning world, we should consider that course builds (especially large-scale digital courseware) need the same kind of prototype. Before the time and money are invested to build a full course, consider building one unit as completely as possible, and make sure your stakeholders (students, faculty, instructional designers, deans, customers) actually want to learn in this course. Choose a prototype unit that is most representative of the majority of the learning in the course; this is usually not the first or last unit.

When the model unit is being designed and built, this is the ideal time to collaborate iteratively with students, faculty, IT, assessment, and instructional designers. While it will take some time to change the model unit as opinions shift, it will not take as much time as remodeling every unit in the course.

After you’ve got stakeholder approval for the model unit design, make sure to carefully document what features this prototype contains, since your team will need to apply it consistently across the full development. Here are just a few of the learning features you might want to apply across your multi-unit build:

  • content: where did it come from? what quantity per learning objective?
  • examples: how often, how relevant?
  • interaction: how much, what kind, and how often?
  • assessment: what kind? how often? authentic? purely for practice? for learning scaffolding?
  • images: for what purpose, how often?
  • videos: how long are they, what stylistic elements are there, how often do they occur?
  • simulations or games: for what purpose? how often?
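
If it helps to make that documentation concrete, here is a minimal sketch of what it could look like as a simple machine-readable design spec. It is purely illustrative: the feature names and values below are assumptions for the example, not recommendations from any particular research or project.

    # A hypothetical design spec for the approved prototype unit, recorded so
    # later units can be checked for consistency against the model unit.
    prototype_spec = {
        "content":     {"source": "OER plus original writing", "per_objective": "2-3 pages"},
        "examples":    {"frequency": "one worked example per objective"},
        "interaction": {"kind": "self-check questions", "per_page": 1},
        "assessment":  {"kind": "formative quiz", "frequency": "end of each objective"},
        "images":      {"purpose": "concept diagrams", "per_objective": 1},
        "videos":      {"max_length_minutes": 6, "style": "screencast with captions"},
        "simulations": {"purpose": "applied practice", "per_unit": 1},
    }

    def missing_features(unit_plan, spec=prototype_spec):
        """Return the feature categories a draft unit has not yet addressed."""
        return [feature for feature in spec if feature not in unit_plan]

    # Example: a draft unit that has not planned videos or simulations gets flagged.
    draft_unit = {"content": {}, "examples": {}, "interaction": {}, "assessment": {}, "images": {}}
    print(missing_features(draft_unit))  # ['videos', 'simulations']

Even a lightweight checklist like this makes it easier for a team to apply the approved design consistently across every unit of the full build.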

As digital learning becomes more accepted (thanks, MOOCs) and blended learning becomes a more standard model at traditional institutions, I hope we’ll see much more collaborative prototyping, followed by intentional design, in these courses.


Canvas Guides for Math and Chemistry

Aug 17, 2013

For those of you gearing up for the Fall Semester, I want to make sure that you know that (1) Canvas upgraded the math editor recently and there is a lot of new functionality available and (2) there are print-friendly guides for using the Canvas Equation Editor. You can include these in your syllabus or as handouts if you are planning to do some intensive Canvas stuff with equations in the fall.

Basic Equation Tips (simple 1-page PDF)

Advanced Equation Tips (for using the new LaTeX feature to do matrices, tables of values, piecewise functions, and more)

Chemistry Tips (writing formulas, scientific notation, and chemical equations/reactions)

You can always find these links on the top of the How do I use the Math Editor? Canvas Guide.
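
If you haven’t used the LaTeX side of the editor before, here’s a small sketch of the kind of markup the Advanced guide walks through. I’m assuming standard (amsmath-style) LaTeX environments like cases and bmatrix here; check the guide for the exact syntax Canvas supports.

    % Piecewise-defined function, entered with the cases environment:
    f(x) =
    \begin{cases}
      x^2 & \text{if } x \ge 0 \\
      -x  & \text{if } x < 0
    \end{cases}

    % A 2x2 matrix, entered with the bmatrix environment:
    A =
    \begin{bmatrix}
      1 & 2 \\
      3 & 4
    \end{bmatrix}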


Activity Icons for Online Course Design

Mar 20, 2013

When I went to build my Social Media MOOC, I wanted to find a great set of icons to visually clue students to the learning activities in the course. I looked high and low, but couldn’t find an icon set that had all the types of icons that I wanted (Discussions, Read, Write, Tweet, Watch, and Groupwork). In the end, I gave up and just hacked together my own icons.

If you’d like to add to the set, I used the IcoMoon App (and the free IcoMoon library), with hex color #2c5782 and 32-pixel heights. These icon fonts were then layered onto a rounded button in Adobe Illustrator.

You can browse the course if you want to see how the icons are used. Here is an example:

If you’d like to use the course activity icons, you’re welcome to do so. Download the course activity icons zip file, which contains all 6 icons and the Adobe Illustrator file for the rounded square icon (in case you’d like to build your own). I will take requests to build more icons for about a week and then create a second batch. Let me know if there’s a learning activity that you need an icon for in your online course. If you create some and would like to share them, that would be great! I’ll add them to the zip file for others to use.

Note that I am not a graphic designer, nor do I play one on TV. If you are a graphic designer, and want to make cleaner icons or different design sets of icons, I’d be happy to share or link to those too.


Video Code Easter Eggs

Jul 9, 2012

I have this sneaky trick I use to tell which students watch online videos and which don’t. I hide “secret codes” in the videos (like a programmer’s Easter eggs). When a student finds one of my “Easter Egg codes,” they can submit it for 1 point towards participation. Sometimes the codes are numbers I generate at random (e.g. 40234) and sometimes it’s a word, phrase, or story (my cat is chasing a fly in front of my computer).

I don’t tell the students where the codes are. I don’t tell them HOW I’ve shared the code or what kind of code it is.  Some videos have codes, and many don’t.  Because of the random distribution of codes in videos, and my “loose” way of collecting them for points, I can always add or remove videos with codes, and it won’t affect the overall point system.

[Image: an example of a video code inserted as a callout bubble]

Let me explain. Students are not required to watch particular videos and there are two other ways to earn participation points.  Participation for each unit is counted out of 10 points, but 5 extra credit points may also be earned.  Thus, there is a cap on the total number of points I will count.

Participation points can be earned by:

  1. Participating in a live online chat. (2 points)
  2. Posting something substantive in a Discussion. (1 point)
  3. Turning in a video code. (1 point)
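
For anyone who likes to see the arithmetic spelled out, here’s a minimal sketch in Python of how the point values in the list above combine with the 10-point cap and the 5 extra credit points. The names and the example numbers are just for illustration; this isn’t my actual grading setup.

    # Sketch of the participation tally: 10 regular points per unit,
    # plus up to 5 extra credit points, so at most 15 points are counted.
    REGULAR_CAP = 10
    EXTRA_CREDIT_CAP = 5

    POINT_VALUES = {"chat": 2, "discussion_post": 1, "video_code": 1}

    def unit_participation(events):
        """events is a list like ["chat", "video_code", "discussion_post", ...]."""
        earned = sum(POINT_VALUES[e] for e in events)
        return min(earned, REGULAR_CAP + EXTRA_CREDIT_CAP)

    # Example: 3 chats, 5 discussion posts, and 9 video codes is 20 raw points,
    # but only 15 of them count toward the unit.
    print(unit_participation(["chat"] * 3 + ["discussion_post"] * 5 + ["video_code"] * 9))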

Here are various ways that I hide the codes in videos:

  • In callout bubbles that I add after recording, before producing the final video
  • On calculator screens (sneaky, huh?)
  • In something I say out loud
  • In something I write on the journaling screen
  • In something I say and write on the journaling screen
  • In the text of a math equation
  • Underlining a particular word or phrase on the screen from a lesson
  • An action to take (call my office phone and sing the quotient rule to me)

Collecting the codes is the real trick.  Some years I’ve used a Google Spreadsheet or Doc for the collection.  This year, I’m using the comment field of the Canvas Graded Discussions.  I set up one Discussion for each unit, worth 10 points.  When a student participates in an online chat, I go to the gradebook for this Discussion and add a comment “Chat 7/9/12 = 2 points” for that student.  When the student submits a video code, I go to the gradebook for this Discussion and add the comment “Video Code 40234 = 1 point”.  Then when I go to grade the assignment, I see not only all the students’ discussion posts, but also all their collected codes and chat points (see image).

[Image: Canvas Discussions grading screen with comments]

It’s really interesting to see which students find and submit the codes and which students never submit a single code. This helps me to track the progression of students through a particular unit. Pessimistically, it helps me to “catch” those students who claim they are watching videos when, in fact, they aren’t. But optimistically, I can also tell who consistently watches all the videos by seeing their collected codes pile up. While I haven’t always enjoyed keeping lists of students and codes, the “Easter Egg” method has worked well over the years to keep track of video-watching as a way to participate in online courses.
