(I want to apologize for making this post extremely long. I really loved this week’s readings, and they all tugged at my “teacher heartstrings”; as I went through them, feelings of hope and despair popped up and excessive self-reflection took place. In keeping with Joseph Ugoretz’s implied philosophy on the use of discussion boards, I decided it would still be ok to post it, but again, sorry about the length; I will try to keep it short next time.)

I especially appreciated Warner’s piece, “The Problem of Technology Hype,” and Watters’s piece, “Hippocratic Oath for Ed Tech.” Often, in the excitement about technological solutions for learning, the problems these authors identify (e.g., the failures of adaptive instruction, the need for compassion) are ignored, although they are very real to the instructors who attempt to carry out these tech-infused learning processes.

What pedagogical opportunities does the integration of technology into the classroom make possible?

I have an ongoing and very real love-hate relationship with ed-tech. I love it enough to say that I would rather not teach without the use of technology (I have done so, but I’d rather not). Technology gives educators huge opportunities for developing and implementing differentiation and scaffolding strategies (assuming there are enough components the teacher can customize). That said, it does not necessarily make this process more efficient, since not every teacher or student can figure out a specific technology faster than it would take to learn a given skill without tech tools; however, it certainly offers robust opportunities to implement these strategies, potentially at scale. Technology also lets scholars, students, and teachers share and collaborate across vast distances in ways that would otherwise not be feasible.

The first time I gained access to smartboards, and later iPads, as a teacher, I was teaching all core subjects to students classified as “intellectually disabled” in a self-contained special education class. Prior to that I was working with a blackboard, chalk, and very old textbooks that were far beyond the reading level of the students they were meant for. The majority of my students were functioning at a pre-K to 3rd grade reading level, while only 2 students functioned at 5th to 6th grade levels. Technology allowed me to make stories and books from higher grade levels accessible to students functioning at lower grade levels. For instance, I used software that let me use text-to-speech, automatically highlight words or phrases as they were read aloud, and adjust the computer’s reading speed. All of this made it easier for me to teach literacy to struggling readers and improved engagement and students’ retention of word recognition and sequencing skills, but this was due to my understanding of my students’ needs more than to the features themselves. Technology can help us help students by allowing teachers to tailor instruction through various software features. This encourages student participation and motivation, but the key is that the technology should make it easier for the teacher to tailor the instruction; the software itself cannot effectively tailor anything, because algorithms cannot get to know a student. They can only take responses, plug them into a formula, and output whatever was programmed into the function. A given X will always yield Y because Y depends on X, but learning is not so simple. Student understanding cannot be elucidated by a pre-programmed formula. An algorithm cannot customize learning; only a human teacher can do that.

What challenges does technology create for the student, the instructor, the institution?

Software cannot replace teachers, at least not for typical students (perhaps for the few self-motivated autodidacts who actually finish MOOCs). Warner points out: “Is repeating the same presentation over and over again to confused students a pedagogical practice we would accept in human instructors? … What is it we think students may ‘learn’ in these systems?” (Warner, p. 92). While many technologies can be customized, or attempt to customize instruction, individualizing instruction and reaching different types of learners is not, in my opinion, something a computer program is capable of. At best, an “adaptive learning algorithm” can reach a certain type of student, typically not students who would qualify as “outliers” on the normal curve the algorithm is aimed at; regardless of how much data is collected and fed into a machine learning system, the outlier data points are unlikely to be connected to a lesson tailored to outlier students. For a student who has misunderstood, repeating the same lesson word for word until they get the right answer is generally not helpful. As any human teacher knows, when a student fails to understand a lesson, you do not feed them the lesson in exactly the same way on the next iteration. Consider a student who is an English Language Learner: a simple phrase can throw off their understanding enormously, and there is no way for the system to know whether the misunderstanding was based on English fluency or on vocabulary unrelated to the specific lesson. Technology can be a very valuable tool, but it is nothing more than a tool; it cannot sit in for a teacher. Treating “adaptive learning software,” which aims to use algorithms to individualize instruction, as if it were equal to a teacher who is capable of learning about their students as individual humans is highly problematic.

Optimizing learning sounds very appealing at first glance, because there is not always time to learn or teach all of the tech skills when one is following a strict curriculum. While technology is meant to optimize everything, and may indeed make it more possible for teachers to individualize learning for a certain type of student, all of this takes time. It takes time to tailor assignments for different students, it takes time to teach students how to work with new technologies, and it takes teachers time to learn the new technologies a school might suddenly require them to use. Generally, true individualization of instruction is not reflected in “adaptive software.” For instance, there is usually no room to digress in a productive way. Any digression is considered a mistake by an algorithm, but teachers can use mistakes and digressions as teachable moments to promote depth of learning. As the “Two Roads Diverged in a Wood” article notes, giving space for tangents in a lesson has learning value: “by allowing or even encouraging digression—by permitting students to take the ‘road not taken’—instructors facilitate a process whereby students may make new and original connections arising from their own thinking and discovery processes” (Ugoretz, p. 2). These “teachable moments” are not recognized by algorithms, and yet they have great value for student motivation as well as for encouraging students to make connections that lead to deeper understanding of content. The asynchronous discussion board can offer much more “personalization” than “adaptive software” can, because it is open and allows the mental wandering that is often part of the learning process, especially when students encounter new topics.

What is reflected in adaptive learning software is “oops, you did it wrong again, now try learning it again in exactly the same way.” This process is painful for students who need real instruction and have a low frustration threshold. For students who have disabilities or traumas, the algorithm can be downright abusive. I was forced to use math software that had this effect when I taught middle school students with learning disabilities. After about 5 iterations of “oops, you’re wrong, watch this same video again and try again,” a student would put their head down and cry, or act out. What is even more sad is that the student knew another way to get the math correct, but the system would only accept its one defined way of demonstrating a skill, aligned with core standards that were soon to be entirely phased out (in my opinion it was a less useful way of doing fractions and division). What is worse is that this “math lab class” was meant for students with learning impairments, and I was not supposed to teach it; I was supposed to just supervise the students interacting with the system (i.e., make sure they stayed on the program the school had spent its money on rather than going online to play video games, etc.).
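To make concrete why that “watch the same video again” loop feels so mechanical, here is a deliberately naive sketch of the remediation logic such software seems to implement. This is a hypothetical illustration, not the actual software: the function names and the toy fractions example are mine. The point is that no matter *why* the student fails, the system can only re-serve the identical lesson (the same X always yields the same Y).

```python
# Hypothetical sketch of a naive "adaptive" remediation loop.
# Whatever the student's misconception, the system can only repeat
# the same pre-programmed lesson and check for the one accepted answer.

def naive_adaptive_loop(lesson, check_answer, get_attempt, max_tries=5):
    """Repeat the identical lesson until the student passes or runs out of tries."""
    for attempt in range(1, max_tries + 1):
        # The lesson shown never changes: there is no diagnosis of the failure.
        answer = get_attempt(lesson, attempt)
        if check_answer(answer):
            return f"passed on attempt {attempt}"
    return "flagged: watch the same video again"

# Toy usage: a student with one persistent misconception (inverting the
# wrong fraction) never gets a different explanation, only repetition.
result = naive_adaptive_loop(
    lesson="Divide fractions by multiplying by the reciprocal.",
    check_answer=lambda a: a == "3/2",
    get_attempt=lambda lesson, n: "2/3",  # same misconception every time
)
print(result)  # → flagged: watch the same video again
```

A human teacher would change the explanation after the first failed attempt; this loop structurally cannot, which is exactly the frustration the students in that math lab experienced.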

Another issue with these formulaic attempts at instruction is that the systems can be hacked. For instance, my step-son, who is in HS Algebra, recently shared with me that he had figured out a way to trick one of the math programs he has to use into believing he had completed all of the assignments, which gave him all of the correct answers, which he then used to get a perfect score. He then shared this trick with other students. The possibility of hacking came from “repeated attempts” after “failing the first few times.” I am not sure how he did it, but he exploited something in the part of the system that aims to “adapt instruction” by simply repeating the same exact lesson.

A district-mandated instructional software package that is definitely not designed with your actual students in mind can be a major waste of money and class time, and can even have detrimental effects on students’ motivation.

I think that schools will need to develop more diverse positions focused on instructional technology; having a computer teacher or an IT person is insufficient to keep up with how quickly educational software is developing and how customizable some of it has to become. Generally, it is not teachers who decide what software a district should purchase. The people making those decisions are usually admins who are not currently teaching any classes (at least in K-12). If core content teachers were allowed to take the lead in identifying resources and training each other, I think it would make for better institutional outcomes. I emphasize core content teachers because they are the ones whose students will be tested most and whose data will be most utilized.

How do we understand the politics of educational technology that is both a field of inquiry and an industry?

The ever-intimate relationship between tech companies and political players increasingly leaves out the values of actual instruction.

The IT industry is focused on optimization and profit far more than on effective instruction. Politicians are focused on having the numbers make them look good (consider how effective Bloomberg was at increasing passing rates by decreasing the minimum score needed to pass). There is a difference between learning and teaching that one cannot appreciate unless one has significant experience in both. It is very easy to feel that you understand “teaching” just because you are good at learning (particularly if you are a capable and self-motivated learner), but it is another story to actually teach others who are not so great at this and require significant guidance. Just because you are good at learning yourself does not mean that you understand what schools or students need (hence the failure of initiatives by Bill Gates to fix education in NYC, for instance). One thing that seems to be overlooked where educational politics meets technological solutions to complex problems, despite being obvious, is that “computers, no matter how powerful their algorithms, can only count” (Warner, p. 97). I think this fact is often ignored, perhaps because the idea of “optimization” is so alluring.

Generally, the politics associated with educational technology pulls us away from the depth of learning and teaching and toward a sort of misguided efficiency or optimization, relying on technological solutions that are not aligned with how students actually learn. I do not understand why wealthy IT moguls are treated as educational experts; they are not. They are simply people with money to spend who happen to be good at learning for their own purposes. Even if their intentions are pure, what qualifies Zuckerberg or Gates as education experts?

There are also costs to taking the time to teach technology skills that are not explicitly considered part of the curriculum. Ensuring access to and accessibility of technology is also very challenging. I do not think that technology companies appreciate that there are students who may struggle to use basic technologies (e.g., I once had to teach a “normal kid” who was a HS senior how to use email, in 2015!). Even when websites try to be accessible, they are not accessible to all people. Even the guidelines for accessibility themselves are not really accessible to everyone (https://www.w3.org/WAI/standards-guidelines/wcag/). It is hard (maybe impossible) to have a set of guidelines that works for all students, because many things affect how we interact with technology, based on who we are and how we perceive and interpret information.

How do we locate our own values within all of this? As teachers we value learning, and as technophiles we appreciate what technology has the potential to do, but we must ask ourselves “is it necessary?”, “is it customizable for the teacher?”, and, just as importantly, “is there time?” I do not think that educator values weigh as much as financial values when decisions are made about which software a school must use. Everyone wants the fastest way to show the best numbers, even when the desired numbers may not be accurate representations of learning.

We want to use technology to make learning “bigger, faster, stronger,” but the human mind does not work that way. There are too many differences, and learning itself is not linear. If the tech industry could appreciate the non-linear nature of the learning experience and be less motivated by profit and more motivated by making learning a positive experience for students (rather than using data aggregation to make correlations and treat them as causations), perhaps better software could come of it, but optimization must be treated as less important than the experience of learning itself. I think that teachers value technology but understand its limits in instruction, and it will take experienced teachers, not experienced IT moguls, to make software that reflects the values of learning and teaching.

Audrey Watters suggests that a Hippocratic Oath for ed-tech ethics might “insist that students be recognized as humans, not as data points. It would demand a respect for student privacy… recognize that ‘the tools’ are less important than compassion… It could call for more professional transparency perhaps… open disclosure about relationships with industry.” I love this idea, but I do not think it is considered much when technologies are developed. I hope that such an oath would be required of the developers and IT companies pushing their software onto schools, and not just of educators, as teachers themselves have limited power when it comes to the district’s or institution’s decisions about what tools must be used.