Thursday, March 5, 2015

Action Research- Part 1

     I've recently performed a little "action research" in my physics classes. Wikipedia defines action research in the following way: "Action research involves actively participating in a change situation, often via an existing organization, whilst simultaneously conducting research". I wanted to get an idea of how successfully I could make videos which taught the concepts of a given unit. Therefore, for the unit on "Waves", I made a series of 6 videos (they can be found at Tyephysics - find the Waves playlist) which students viewed in place of my traditional lectures. This wasn't an attempt to "flip" my classroom. My plan was to make only one change in order to see the effect(s) of that change alone (the old experimental controls and variables idea). It seemed to me that many teachers were making too many changes at once (learning through videos, flipping, extensive use of iPads/Chromebooks...), and therefore were reducing their chances of discerning what did and did not work. If I made all the aforementioned changes, and the students bombed the unit, how would I know what the issues were? Maybe all my changes except one were improvements, but one change caused all the problems. If the students were wildly successful, I would still have the same unanswered questions, but from the opposite perspective. Maybe only one change I made accounted for all the improvements, and the other changes either made no difference or even dampened the overall success of the students. At best, this would mean that I was wasting time and possibly money on unnecessary changes. So, my only objective was to have the students learn through the videos. Thus, they watched the videos in class (for the most part), not at home. I don't pretend that this "experiment" was set up to produce results that would hold up under high levels of professional scrutiny. A lot of the way in which I set this up and analyzed the results was done by intuition; I don't have the background to produce higher level, scientifically verifiable results. However, I think that I was able to learn some things from the results.
     After the unit was completed, I gave the students a short survey. All questions were scaled on a 1-10 basis. A "1" meant that they Totally Disagreed with my survey statement, a "10" meant they Totally Agreed, and a "5" meant they were Neutral. They were free to pick any number from 1-10 based on their agreement level. Below are the questions, along with the average of all student responses. I recognize the limitations of just calculating the mean. An average of "5", for example, could be produced from all "5's", or from a mix of "1's" and "10's". I did a visual inspection of the results to get a quick idea of the spread of the data, but didn't do much beyond that. I have some thoughts on the results, but I am going to save them for a future post. One reason is that, if anyone is interested, they can comment on their interpretation of the results without me biasing their thoughts. I also recognize that these may not have been the best possible set of questions; I already know one or two changes that I would make if I were to do this again. If you have questions that you think should have been asked, omitted, or changed, feel free to chime in. Lastly, one more (possibly) important thing: how the students did on the unit test. This is definitely not a statistic that I can vigorously defend, but test averages were up about 4 points or so. There are too many unaccounted-for variables to say whether this means anything. I'll write about my thoughts on this in a future post.

Q1: I felt that the videos were a useful way to learn the ideas of this unit. Mean: 6.9
Q2: I felt that the videos were a better way to learn the ideas than traditional class lectures. Mean: 5.4
Q3: I like that the videos were always available if I wanted to re-watch a segment that I didn't understand the first time I heard it. Mean: 8.4
Q4: I like that I could watch the videos if I was absent so I wouldn't get too far behind if I missed school. Mean: 8.2
Q5: I would like the whole course to be taught this way. Mean: 3.5
Q6: I like that I can move at my own pace through a video. Mean: 7.5

A "Flipped Classroom" is one in which students watch videos at home and work on homework problems in class. We did not do a flipped classroom in this unit, but we did learn through watching videos instead of the traditional lecture. Based on that:

Q7: I think that watching videos at home and working on problems in class (a flipped class) would be a better way to learn physics. Mean: 4.3
Q8: I think that a flipped class would be a useful way to learn in other subjects (English, Math, Social Studies...). Mean: 4.0
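The limitation of the mean mentioned earlier (that an average of "5" can come from very different kinds of classes) is easy to illustrate with a quick sketch. The response lists here are hypothetical, made up for illustration, not the actual survey data:

```python
from statistics import mean, stdev

# Two hypothetical sets of ten 1-10 Likert responses (NOT the actual
# survey data): both average near 5, but the spreads are very different.
all_fives = [5] * 10             # every student answered "5"
polarized = [1] * 5 + [10] * 5   # half answered "1", half "10"

for name, responses in [("all fives", all_fives), ("polarized", polarized)]:
    print(f"{name}: mean = {mean(responses):.1f}, stdev = {stdev(responses):.2f}")
# all fives: mean = 5.0, stdev = 0.00
# polarized: mean = 5.5, stdev = 4.74
```

The means are nearly identical, but the standard deviation immediately distinguishes a genuinely neutral class from a sharply divided one, which is roughly what the visual inspection of the spread was checking for.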
                                                                

Friday, February 27, 2015

Proctor Track

     So, a student takes an online course. How does the instructor know that the student is not cheating on tests, or that it is even the student himself (and not some smart friend) who is taking them? Up until now, there were several options. One was to ignore this inconvenient fact. Another was to have students take tests in front of a proctor; some colleges have testing centers where students go to take online tests, for example. I had to take a content specialty test when I applied for the NY State Master Teacher program. It was done online at a testing center in Vestal, NY. The security was intense. I had to show my ID upon entering and exiting the test, lift up my pant legs to show I didn't have anything hidden in them, and get scanned with a metal detector. Also, a proctor continuously walked about the testing room during the exam. Now, a new alternative exists called Proctor Track. It uses biometric software to check on online students through such things as facial and knuckle recognition (that's a new one to me), keystroke and mouse movement monitoring, and detecting whether a student leaves his computer or gets help from a friend during a test. There are other programs that do this as well, such as Proctor Free.
     As you might imagine, there are some who are protesting due to privacy concerns. Students at Rutgers were particularly upset when they learned that this technology would monitor their online courses. Although there may be some privacy issues to address, I don't think that students (or people in general) realize how little digital privacy they already possess. For example, I just saw an interesting YouTube video from the channel "Vsauce" titled How People Disappear. In this video, the host tells of a father storming a Target store because they had been sending coupons to his high school aged daughter for things like cribs and diapers. He accused the store of encouraging his daughter to get pregnant. Interestingly, a few weeks later, the father apologized to the store because his daughter was, in fact, already pregnant. The store, by monitoring and analyzing her previous purchases (she bought things like vitamin supplements and scent free soap), had used its own algorithms to figure out that she was pregnant well before her family found out. Similarly, my own students have not caught on to how much I know about how they completed their homework when I give an assignment through EdPuzzle (see earlier post). I can see things such as exactly when they viewed the video (2AM?....right before class?...), and how many times they re-watched any given segments.
     Digital privacy is a huge issue and is much more complicated than can be addressed in one small post. At this point, I have to admit to feeling somewhat favorable towards the use of technologies such as Proctor Track. The bottom line is that, in order for online learning to be legitimate, there needs to be accountability. It will be interesting to see how this plays out as it becomes more and more commonly used by learning institutions.
   

Tuesday, February 24, 2015

Has Technology Changed the Way Students Think?

     I have just finished reading a book called "Digital Leadership: Changing Paradigms for Changing Times" by Eric Sheninger. He is one of the "hot" ed tech school gurus right now, and he spoke at our school this past Fall during a professional development day. Sheninger is the former principal of New Milford High School in New Jersey, and is known as "The Twitter Principal". The back cover of his book states that his philosophy "...takes into account recent changes such as ubiquitous connectivity, open source technology, mobile devices, and personalization to dramatically shift how schools have been run and structured for over a century". While at New Milford, Sheninger pushed for a complete embrace of all things digital. For example, instead of banning Smart Phones, he encouraged teachers and students to use them as learning tools within their classes, implementing a policy known as BYOD (Bring Your Own Device). There is an app known as "Poll Everywhere", for instance, which allows a teacher to ask students a question and have them text in their answers; it takes the place of the commercial "clickers" that teachers have used in the past to get responses from all students. Sheninger's BYOD idea was ingenious in the sense that, while he wanted all students to use digital technology tools in his school, he realized that the funds didn't exist to outfit each student with a laptop, Chromebook, or iPad. He leveraged the fact that most students already possessed powerful digital tools in their phones; any students who didn't own Smart Phones could be provided some type of digital tool by the school. As principal, Sheninger embraced technologies such as Twitter and Facebook to communicate with parents, teachers, and students. So, how did this all work out? In terms of student learning, it is tough to tell. When Sheninger spoke at our school, he claimed that student achievement at New Milford had improved dramatically, but he didn't provide many details to back up his claims.
      This somewhat mirrors the general ed tech trend. There is not much hard data which shows that digital technologies dramatically improve educational outcomes. One problem I see is that some of the ideas embraced by the leaders in the ed tech movement are somewhat questionable. Here are a couple from Sheninger's book that I question:

" Digital learners prefer instant gratification and immediate rewards, but many educators prefer deferred gratification and delayed rewards"
"Digital learners prefer parallel processing and multitasking, but many educators prefer linear processing and single tasks or limited multitasking".

Let's look at the first of Sheninger's tenets. Just because digital learners prefer instant gratification doesn't mean that this is a desirable trait. In fact, if it is true, it is likely a mark against digital learning, because it ignores well established findings in psychology which show that the ability to delay gratification is an important predictor of academic success. Stanford University's famous "Marshmallow Test" was the first and best-known experiment illustrating this phenomenon. In it, children were given a marshmallow and told that they could eat it right away, but that if they could wait a while, they would get a second marshmallow. Thus, it was a test of a young child's ability to delay gratification in order to obtain a greater reward in the future. Subsequent follow-up research showed a high correlation between a child's ability to exhibit self control (by not eating the marshmallow right away) and his/her subsequent success in school. Although some have questioned these findings, they match my own anecdotal experiences well. Here is a fun link which recreates the original experiment.

The second of Sheninger's tenets refers back to the title of this post. According to cognitive psychologist Daniel Willingham, the answer is "no". In the article referenced by the preceding link, Willingham states, "Is multitasking a good idea? Most of the time, no. One of the most stubborn, persistent phenomena of the mind is that when you do two things at once, you don't do either one as well as when you do them one at a time". He adds that research shows "...even simple tasks show a cost in the speed and accuracy with which we perform them when doubled up with another, equally simple task". Willingham has a lot more to say in this article, and it is well worth reading.

So, what to make of all this? As with all things "new", it is tempting to believe that some of the traditional, perennial problems in teaching (students preferring instant gratification and not wanting to concentrate, for example) are artifacts of the mode of instruction rather than challenges inherent to teaching children. I've seen this happen throughout my teaching career. If technology is going to make a meaningful improvement in education, its proponents will have to maintain a consistently sober perspective on what technology can and cannot do. Technology will most likely have negative as well as positive effects in the classroom. Discerning between the two is imperative.

Wednesday, February 18, 2015

Willingham on Smart Phones

My favorite cognitive psychologist (doesn't everyone have one?), Daniel Willingham, recently had an op-ed piece in the NY Times about Smart Phones and their effect on attention spans. I've read a lot of his stuff (books and articles) and his writing style is both accessible and thorough. He has recently taken to writing a lot about education issues.

Monday, February 16, 2015

EdPuzzle

EdPuzzle is a technology app that I have been waiting for for a long time. It allows me to upload a video (from YouTube or anywhere else; my videos, or someone else's) and add questions that a student has to answer while watching it. It also has a "no skip" feature that prevents students from skipping ahead while watching the video. Students log in with a preassigned code which sends me the results of their efforts. I not only get their results, I also know exactly when they did the assignment and how many times they watched different segments of the video. They can re-watch earlier segments if they want to find missed information for a question. This gives me a sense of the effort they put into their assignment. One of the main hurdles I have historically had with the "flipped class" model is that there didn't seem to be a way to determine whether students actually watched a video. This app significantly diminishes that problem. In fact, it may actually be an improvement over the content presentation portion of a lesson as it is currently done, at least as it relates to the "slackers". As it now stands, it is relatively easy for slacker students to sit in a class, look at the teacher, and write things down without really paying much attention. As a teacher, I try to overcome this problem by asking questions and otherwise engaging students during my presentations. However, if I have 20 students in a class, a student has, at least in theory, only a 1-in-20 chance of being called on to answer any given question. Also, when called upon, students can feign effort by giving me an "answer" which is really just a random guess, and it is usually tough to call them out on this. With EdPuzzle, all students have to answer all questions while being introduced to new content. The questions I embed in the video should be relatively easy to answer, provided the student is paying attention. I would reserve the deeper, more subtle questions for actual class time, when I can discuss the issue in more depth. The EdPuzzle portion would be used to give students baseline content within a unit. I guess there would be ways to "cheat" the system, but none of the methods I can think of would be any less effort than just watching the 10-15 minute video; it is currently easier to copy a fellow student's homework than it would be to circumvent EdPuzzle. I've tried EdPuzzle a bit already, and have been happy with the results. I'll have to continue to think through how this might influence the structure of my class, but I see EdPuzzle as a possibly very useful accountability tool. Here is a link to an EdPuzzle YouTube video series.

Sunday, February 8, 2015

Godzilla vs King Kong vs Megalon : Google, Microsoft, and Apple Fight for the Public School Market

Note 1: This post is not meant to draw a value judgment on the usefulness of any of these technologies (yet). I am merely trying to summarize the situation as I currently see it.
Note 2: All students in my son's 7th grade class have just received iPads. All the students in his high school already have received iPads. Also, I was at a teacher's meeting recently and a teacher from Ithaca High School told us that all students at his high school have just received Chromebooks.

    Three mammoth players in the computer world are currently engaged in a giant battle to position themselves as the dominant player in the enormous education market. Apple was the early leader in terms of infiltration into the classroom, starting around 2010 with the development of the iPad. The idea was that the iPad would replace the traditional textbook. As described by Apple itself, iPads would be more portable, durable, interactive, searchable, and current than the traditional text. Very quickly, schools hopped on board with massive investments to put the iPad in the classroom. A 2011 NY Times article describes a common occurrence: "As part of a pilot program, Roslyn High School on Long Island handed out 47 iPads on Dec. 20 to the students and teachers in two humanities classes. The school district hopes to provide iPads eventually to all 1,100 of its students". In 2012, Apple dove in deeper. As reported in Venturebeat, "Apple announced iBooks 2, an updated iPhone and iPad app that will offer highly interactive electronic textbooks, as well as a new textbook section in the iBookstore. The company also showed off a new version of iTunes U, which gives teachers the ability to do much more than create lectures for download". According to Apple, "...iTunes U, (is) a service that lets students download lectures and other materials from iTunes". Cue says Apple has seen over 700 million downloads from iTunes U; there are also "over 20,000 education apps for the iPad" and, according to Apple, currently "1.5 million iPads being used in education".

    So, it is no surprise that Google and Microsoft are also interested in the education market. From my own perspective, both of these giants have really made inroads within the past two years, and even more so within the past year (especially Google). They have taken a different tack than Apple, however; both have put their efforts into more of a Cloud based approach (which makes sense, especially for Google). Google struck first with the development of the Chromebook in 2011. This is a laptop which operates almost exclusively in the Cloud through Google's Chrome operating system. It is currently available for around $200 (less than half the price of an iPad Air 2). Whereas iPads are meant to be dedicated to a single student who downloads all his own apps and textbooks, the Chromebook can be kept in a single classroom, and students from each class can sign on through their Google accounts. Therefore, the teacher can use the same class set of machines throughout the day for all his classes. Because they are not bogged down with a lot of programs, Chromebooks boot up incredibly quickly. Also, since everything is on the Cloud, students can access their materials from any computer. Google has also developed a large assortment of its own apps for education. The whole "apps" thing was one major reason that iPads had the most early success penetrating the classroom; Google couldn't originally match all the iPad apps, but that is now beginning to change. Currently, public schools can become "Google Schools" and use all the "Google Apps for Education". As a teacher, I can set up my own "Google Classroom" using my Google (gmail) account on my "Google Drive". According to Wikipedia, there were 400,000 Chromebooks sold in 2012, which jumped to 1.76 million by 2013. According to other sources, there were 5.2 million Chromebooks sold in 2014. As of September 2014, Google claims that 30 million students and teachers are using Google Apps for Education.
     Microsoft has jumped into the fray with its Chromebook alternative, the Stream 13. It runs Windows 8.1, and comes in at around $200. Microsoft is doing this because, according to Computerworld, Apple and Google have eaten into "...Windows PC sales, which dropped about 6%, slipping from a 72.3% share in 2013 to 68.4% in 2014." If they buy a Stream 13, the general public gets one free year of Office 365 (a subscription based version of Microsoft Office 2013); schools and students, however, get unlimited use of Office 365. With the Stream 13 and Office 365, Microsoft is hoping to capitalize on some of the Chromebook's weaknesses. Chromebooks operate almost completely online, while the Stream 13 is a "software plus" machine which combines Cloud capabilities (cloud storage on OneDrive) with locally running software. One advantage of this is that the number of apps available is greater than what Chromebooks can currently offer. Also, with the Stream, you can work offline. One of the common complaints I've heard about the Stream, however, is that the processing power that comes with a $200 price tag means it gets bogged down if pushed very hard. Also, according to a reviewer at laptopmag.com, a Chromebook boots up in 7-8 seconds while a Stream can take over 30 seconds. This may not seem like much, but (in my opinion) if a teacher wants to use one of these devices in the middle of a class, that time difference can be significant. One feature I currently use on my desktop version of Office 365 is OneNote, which is more of a free form note taking app. I find it useful in presenting information and notes to my classes.
     I wonder if what we are seeing is a replay of the VHS versus Sony's Betamax battle at the beginning of the home video revolution?