
Educational Technology and Life has moved:
Visit edtechlife.com for new content.

Monday, May 30, 2005

iPod in Education

This is probably the last response to a classmate that I will pass on. I am done with my doctoral coursework!

(A moment of silence, please... unless you feel like cheering for me... or better yet, have a BBQ and a beer for me today!)

Now it's on to the solitary reading, researching, and writing for me.

Ok, that's enough "and life"... back to your educational technology.

Here's your iPod in Education post. The bit in italics was a post by a classmate. Then comes my response...

The Apple Computer iPod music player has many uses for the classroom. More than a music player, the iPod is an exciting new audio tool that can help enhance student learning in any subject area, particularly for language and literacy development. Historical speeches, influential symphonies, conversational Spanish. The iPod and the Griffin iTalk Voice Recorder can be used to record any kind of audio files, from classroom lectures to poetry readings. The possibilities are endless — students can share personal notes, track small group discussions, or conduct interviews.

The iPod can even be used for teacher professional development. Apple has professional development content for teachers to download.




This is a powerful (and powerfully cool) new application of technology in education. I'm lucky enough to have been in a position to see some of this develop (at the OCDE we began offering classes on "iPod in Education" in the fall and one of our coordinators - a guy I had hoped to work with for a long time - got hired away by Apple to be their manager of iPod in education)... but with the growth of the read/write web (and podcasting) innovation is happening at a blindingly fast rate in this field. There is something new about iPods in education in my aggregator every day!

My wife drank the Kool-Aid in the fall and now uses iPods to pre- and post-assess her kindergarten students, and as an integral part of her Movie Magic after-school class for 1st and 2nd graders, who made iMovies in the style of "Reading Rainbow" in which they used the Ken Burns effect to display the pictures from the book... and they read the story for the soundtrack (recorded using iPods and iTalks).

Check this out... Jason Ediger (the new Apple employee I mentioned above) FURLed this today... the education podcasters network at http://www.epnweb.org/

See also this post by Will Richardson (something of an authority on the read/write web in education) in which he discusses some of the latest developments in podcasting... http://www.weblogg-ed.com/2005/05/30#a3613 (it was part II in a series, so you should be able to find more info on his blog.)

Oh, and if you want to subscribe to Ediger's archive... http://www.furl.net/members/jediger

I suppose I should have mentioned RSS as a killer new educational technology! That is how I get most of my news and current research these days. Too bad we don't have time for that conversation here, too.


Thursday, May 26, 2005

Paper on its way...

Those of you who have seen my iChat status showing red and the single word "writing" should know that I didn't walk away and leave my computer unattended. I am actually writing. I have a management of technology for education paper on MMORPGs due on Sunday, and right now I am on track for writing something that is double length and a little too much like a trial run at my literature review for my dissertation! I have a lot of slogging, and a lot of tough decisions ahead of me, but I'll post whatever I am left with Sunday night here at Educational Technology and Life, too.

Incidentally, can you tell I am practicing actually blogging now that I will not be completing weekly assignments for class anymore?

Since this is the last assignment of the last of my coursework in my dissertation program, this is conceivably the last formal university class I'll ever take! (Or at least the last one in which I'll care about my grade rather than simply what I learn!)

Arg... back to it.


Tuesday, May 24, 2005

More on Games and Education

The following is my contribution to an optional "emerging technologies" discussion in the last week of Management of Technology for Education, followed by two responses and my replies.

An emerging technology (in education) that excites me and gives me hope for the future in addressing the needs of students is the application of video games as engaging teaching and learning tools. I am particularly interested in the potential of multiplayer online role playing games as constructivist learning environments, and my final paper will focus on the management issues involved, so I will post it here when I am done.

I'm sure I've shared these books on the subject here before:

Aldrich, C. (2004). Simulations and the future of learning. San Francisco: Pfeiffer.

Prensky, M. (2001). Digital game-based learning. New York: McGraw Hill.

Gee, J. P. (2003). What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.

And here is a brand new one I just received:

Aldrich, C. (2005). Learn by doing: a comprehensive guide to simulations, computer games, and pedagogy in e-learning and other educational experiences. San Francisco: Pfeiffer.


Another book of interest may be:

Iverson, K. M. (2005). E-learning games: Interactive learning strategies for digital delivery. Upper Saddle River, NJ: Prentice Hall.

It is less a book of theory and more a collection of classroom strategies but I found her “games” to be an interesting look at applying the theory to the realities of the online classroom.

Wow. Thanks, Wyll. I am ordering it today. It's almost certainly too late for this paper, but will help with my KAMs and dissertation!

When ordering this, I also discovered and ordered "Engaging Learning: Designing e-Learning Simulation Games" by Clark N. Quinn.



It may be a trend that I am seeing lately (maybe more so because you have expressed an interest and shared with the class this term), but have you noticed that recent instructional technology conferences have at least one session on games/simulations/virtual reality, etc.?

Do you plan to attend any of these conferences to support further research and study?

Yes. And absolutely - as many as work will allow. :)

Last week, the LA Times even ran a story called Geek Fun Isn't Frivolous.

This doesn't really count, but I am offering a course in Games and Learning at the OCDE in August, and I am applying to present the topic at next year's CUE (Computer-Using Educators) Conference and next year's NECC (National Educational Computing Conference). CUE has shown such interest in the topic that they are now courting Clark Aldrich, James Paul Gee, and Marc Prensky as possible keynote speakers.

I just got finished attending an entire conference dedicated to this topic, the Education Arcade Games in Education Conference. It was great to meet many of the authors and practitioners in the field. I wish I were able to attend the Games, Learning, and Society Conference next month; alas, it may not come to pass. Actually... as I write this... and review the site I just linked to... I realize I better be there. Time to spend my last vacation day (and a sick day?) of the year, spend some more financial aid, and get myself out there.

Boy am I glad you and Wyll responded to my post this week. :)


Sunday, May 22, 2005

Happy feedback... and more on student feedback

I couldn't believe it when I read this response to one of my posts in class, and I am doubly thrilled that it led to more writing...

As an Instructional Technology Specialist with just one year of experience, I find your discussion so very helpful. Thanks so much for this valuable resource. I am particularly interested in your use of focus groups. Do you have additional thoughts in this area?

Wow. Thanks for the positive feedback, Ada. It is great to think that this work is doing someone (other than me) some good rather than simply fulfilling an assignment requirement.

As for focus groups... I don't know that these things would all be considered focus groups, but while I am an opponent of bureaucracy - and even of decision making by committee - I am very much a proponent of respecting and tapping into the experience and creative energies of those around you. In addition to strengthening your own data gathering and analysis capacities, this has the added benefit of helping the others involved find a stake in - and ownership of - your decisions.

For instance, as a site technology coordinator, I would take any important decisions to our tech committee, a multi-disciplinary committee of representatives from each department in the school. In some cases this was the department head, and in some cases a more technically inclined person was appointed by the department head. This was fine, because any major decisions also went before the department heads and administration at their bi-monthly meetings. This is not to say that these committees made decisions for me (and certainly not in the way that committees often make the least active, most conservative decisions because no one wants to take responsibility for a change or risk), but rather that I was able to get input from the perspective of an expert in each department before making my decision, to engage them in dialog about difficult or controversial decisions, and thus be better prepared to present a final decision that they were more ready to accept. (Many decisions also had to be taken to the budget and appropriations committees, who did sometimes have a little more decision-making power, but once I was able to justify a decision in terms of the previous committees' desires, these last two were almost always a slam dunk.)

At the district level I saw the disaster that came from failing to do this, and how my few efforts to solicit input from teachers as part of my decision making process could make such a big difference to them - and to my decisions... it sometimes became perfectly clear what decision to make in a situation I had been stewing over myself.

At the county level, where I now have much more freedom to do things the way I think is best than I did while managing a grant at the district level, I am once again beginning to reap the benefit of better serving our customers (the schools) by including representatives in our planning processes. Though I have done my share of (what I think was) swift and decisive "paradigm busting", I have at the same time created a committee (more of a focus group really) for nearly every project I manage. In fact, I think this has made it possible to make the changes that I have because it is absolutely clear to my superiors and colleagues that the changes are necessary... thanks to input from those "in the field" who will be using our services. Of course, only five months into the job, many of these changes are still underway - or only in their beginning stages - and it remains to be seen how successful they will be.

So, this is not particularly research based, as posts in this class go, but since you asked, I hope this articulation of my experiences might be of some value.


Even more on surveys

Another post I wrote for the survey thread in class...

As you know, response rates for surveys are low. The nursing program likes to have our graduates assess the program after working a few months to identify gaps or weaknesses in our program. This year we got the most responses (although the rate is still low at 30%) from using an online survey tool. To measure valid outcomes, what type of response rate would you expect?

Most of the online surveys I am doing have very high response rates, and this is because there is one factor that is very different from what you would ordinarily expect from an online survey... the respondents are all in the same room. For instance, I am using online surveys as evaluations of a class or class session. Most of the trainings I run have participants sitting in front of computers... it is much more efficient to have them enter data and comments directly into an online survey than to have them fill out paper forms. We save paper, the quantitative results are instantly tabulated and reported, and the qualitative results do not need to be deciphered and keyed in by our clerical staff. (More often than not, these things just don't happen at all with the paper evals... we would just glance at them, maybe make photocopies - for crying out loud - for our supervisors, and then put them in the cabinet for "audit" purposes.)
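As an aside, the "instantly tabulated" bit is less magic than it sounds. Here is a minimal sketch of the kind of summary these services automate; the question names and numbers are made up for illustration, not taken from any survey mentioned here:

```python
from collections import Counter

def tabulate(responses):
    """Summarize Likert-style survey answers: per-question response
    counts and the mean rating."""
    summary = {}
    for question, answers in responses.items():
        summary[question] = {
            "counts": dict(Counter(answers)),          # e.g. {5: 2, 4: 1}
            "mean": sum(answers) / len(answers),       # average rating
        }
    return summary

# Hypothetical course-evaluation responses on a 1-5 scale:
responses = {
    "pace": [4, 5, 4, 3],
    "relevance": [5, 5, 4, 5],
}
print(tabulate(responses))
```

The qualitative (free-text) comments still need a human reader, of course; this only covers the quantitative side.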

That being said, I also use them to survey groups like the district technology leaders and our techlink listserv and get fairly high response numbers, but these folks have a long term relationship with us and a vested interest in providing feedback.

I did do one open survey, a needs assessment for our next round of classes, which I asked the district technology leaders to push out to their own site-based people and their staff... and got only about a hundred responses... a remarkably small percentage of the tens of thousands of teachers in Orange County. :)

Given my experiences in 8427 and 8437, I think the validity relies more on the representativeness of the cross-section of your population who respond than it does on pure numbers. Unfortunately, when it comes to representative samples, the educators who will reply to an online survey are not at all representative of all educators. This was seen in the bias of the data I received from the needs assessment... there was almost no demand for beginning technology classes - though it is clear that many teachers in the county still require these skills - and a greater demand for the "latest and greatest" than common sense would tell us most teachers have access to.

I hope this was a helpful response. A more direct response to your question, though, is a matter of statistics and measuring the margin of error (or confidence interval) associated with a certain population proportion. I'll admit I pulled out the statistics book again, but I think the discussion is probably best left for a statistics class. :(
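For what it's worth, the back-of-the-envelope version of that margin of error is simple enough to sketch. The sample sizes below are illustrative only, not figures from the nursing survey:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Margin of error for a sample proportion at roughly 95%
    confidence (z = 1.96): z * sqrt(p(1-p)/n)."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Illustrative: a 30% response rate on 200 surveys yields n = 60
# responses. Using p_hat = 0.5 (the worst case for the margin):
print(round(margin_of_error(0.5, 60), 3))   # about 0.127, i.e. +/- 12.7 points

# Quadrupling the responses only halves the margin:
print(round(margin_of_error(0.5, 240), 3))
```

Of course, a tight margin of error says nothing about nonresponse bias, which is the bigger worry raised above: the formula assumes the respondents are a random sample of the population, and survey respondents rarely are.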


Donald Kirkpatrick’s 4 Levels of Evaluation

A classmate brought this up and I thought it might be interesting to post here...


This was a model I hadn't encountered yet either, so thank you for sharing your experience with it.

Others interested in Donald Kirkpatrick’s 4 level model,

I poked around online a bit with a Google search and found his model listed in San Diego State University's Encyclopedia of Educational Technology, complete with a useful graphical representation of the model.



Follow Up on Student Feedback

A classmate replied to the story I posted yesterday. Here is my response...

Mark, thank you so much for sharing your story. It was thought provoking. Do you have any thoughts on why you got that particular response? Do you think that there was a paternalistic or maternalistic effect? Mother or Father knows best, so we don't have to ask the children. Mary Ann

Thanks for the response and questions, Mary Ann. I have three thoughts about why my suggestion met with these reactions... and one of these answers the (possibly more interesting) question of why I let it go:

1.) It was actually shocking because people don't often solicit students (or children in general) for their opinion. The county office may be closer to the rest of the world in this respect than it might be to a school.

2.) Many of the people on the committee are not educators... they are employees from the IT, HR, legal, and accounting departments, and so have a mindset more like the rest of the world than an educator might.

3.) And this is the one that explains why I let it go... It would be a lot of extra work to survey employees' children, and none of the volunteer committee members at the table (including me) wanted to actually do the work.

Though I don't think this story was a great example of this, I am proud to report that an intern in our department called me a "paradigm buster" on Friday. :) I hope to actually follow through on the teacher training programs he was talking about in a more determined and effective manner than I did on the "Children at Work Day" committee. :)


Performance Standards vs. Content Standards

An exchange between classmates, and my response...


You wrote: In the Social Studies courses I teach, there is a lot of material to cover. It is difficult to pace the class to cover everything the California standards say we should cover and yet go into enough depth so the students really grasp the concepts.

Georgia is currently undergoing a massive restructuring to eliminate the problem you currently have. Our curriculum was described as having a lot of breadth but no depth. You may find this link regarding our Performance Standards interesting:




This is exciting news - thank you for sharing about the Georgia performance standards. I read all of the FAQs at the site you sent us to, and can say that California suffers from the same sort of problems with our content standards, especially that "it would take twenty-three years—not twelve—to cover the topics included at anywhere near the level of depth necessary for real learning to take place." In fact, at the recent Orange County High School Summit, keynote speaker Dr. Daggett reported that one of the attributes of the top 30 schools in the nation which differentiated them from the next 300 (still very good) schools was that the top 30 took the time to prioritize their content standards and cut them down by 1/3 (which would bring Georgia's 23 years down to a more manageable 15). Naturally I am excited that the performance standards are based on more authentic outcomes, too, but I wonder how this will play out in their standardized testing and the high school graduation test.

I presume you are working in Georgia. Have you had a chance to work with these new standards yet? Regardless, do you have any initial opinions of them or their implementation? Are they receiving any resistance from educators in the state?


Saturday, May 21, 2005

Disadvantages of Likert scaling

From another class thread...

Disadvantages of Likert scaling--
• participants may not be completely honest - which may be intentional or unintentional
• participants may base answers on feelings toward surveyor or subject
• may answer according to what they feel is expected of them as participants
• scale requires a great deal of decision-making
• can take a long time to analyze the data

I am a fan of judicious use of the Likert scale in surveys, but I found one key disadvantage missing from this list. Unless there is some kind of descriptive rubric involved, respondents may interpret the scale differently from one another, such that one person's four might be equal to another's five... and still another's three. I have often gotten an all-fives review for a course I know was sub-par... but perhaps it was not as bad as the last experience that participant had been through. Similarly, I have gotten fours from people that I know enjoyed and got a lot out of a session I felt was stellar, but in talking to them I realize that there is very little that would ever cause them to use a five rating... that they are saving it for something better than anything they have seen before.

Still, on the flip side, including a rubric of some kind can drastically increase the decision-making time needed to respond to a question. In fact, in the CTAP2 iAssessment I've mentioned in a few other posts here, teachers used to complain that it was too long, at 45 multiple-choice, Likert-like questions. Now it has been reduced to only 15 questions (or so), but the detailed rubric makes the assessment at least as time consuming!
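One statistical hedge against respondents anchoring the scale differently is to re-express each rater's scores relative to their own average and spread before comparing across raters. A minimal sketch, with made-up data and a function name of my own invention:

```python
from statistics import mean, stdev

def zscore_per_rater(ratings):
    """Normalize each rater's Likert scores against that rater's own
    mean and standard deviation, so a habitual 4-giver and a habitual
    5-giver become comparable."""
    normalized = {}
    for rater, scores in ratings.items():
        m = mean(scores)
        s = stdev(scores) if len(scores) > 1 else 1.0
        s = s if s > 0 else 1.0   # guard: rater who always gives the same score
        normalized[rater] = [(x - m) / s for x in scores]
    return normalized

# Two raters who rank three sessions identically but anchor the scale
# differently produce identical normalized scores:
ratings = {"generous": [5, 4, 3], "strict": [4, 3, 2]}
print(zscore_per_rater(ratings))
```

This only works when each respondent rates several items, and it trades away the absolute meaning of the scale, so it is a complement to a good rubric rather than a substitute for one.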


More on student feedback... and two kinds of evaluation

From the same thread...

This has turned out to be an interesting thread!

Lisa, you bring up an interesting point when you say that you "like to give the students their rubric (if I am using one) before they begin their assignment." When I was in my credentialing program, a professor of mine promised to never grade us on anything for which we did not already have the rubric. Though I recognized that this limited the number of things he could grade us on, it did not reduce the quality of tangential discussions we had, and it clearly created a secure and stable environment within which we could learn with far less anxiety. I was so happy with it that I have continued to try to hold myself to the same standard since that time. At this point it seems to me as if this ought to be a sort of law of teaching... that if you are going to grade a student on something, they ought to know what it is they are being graded on and what you expect of them.

This, of course, is an entirely separate issue from evaluating the effectiveness of your own teaching. With their ASSURE model of instructional design, Smaldino, Russell, Heinich, and Molenda (2005) recommend two forms of evaluation: assessment of learner achievement, and the evaluation of methods and media.



Smaldino, S.E., Russell, J. D., Heinich, R., and Molenda, M. (2005). Instructional technology and media for learning. Upper Saddle River, NJ: Pearson Merrill Prentice Hall.

A bit on student feedback

Another response to a thread in class...

I think encouraging student feedback as part of your assessment is an excellent idea

Chris and Mia,

I agree, and this thread has prompted me to share an anecdote that occurred earlier this week.

At the OCDE, which is an event planning machine if nothing else, there is an event called "Children at Work Day." On this day, employees bring their children to work. The original intent was to allow children exposure to their parents' work environment (and apparently it grew out of an attempt to expose girls to the various careers available at the county office), but it has degenerated into a day of the various departments taking turns entertaining children. At any rate, apparently in recent years they've had trouble getting secondary-aged students to the event. Another lady who is new to the committee and I had some definite ideas about returning the day to its roots and trying to provide secondary students (at least) with exposure to the work environment. We had some creative ideas, and there was some disagreement about what the students would like. I suggested (of course) that we could survey the employees' children and see what they would like to get out of the day. The reactions in the room were astounding. Some reacted as if I had just uttered the solution to achieving world peace, but were quickly sort of shushed by others who said "no, no, we can just..."

Initially I was shocked that they were shocked. Then I was amazed that the idea was shot down. I don't know why we should not always include our students' feedback as part of our evaluation process whenever possible... they (especially at the secondary level) may know best whether or not our instruction is effective.


More on Survey Monkey

Another response from class...

This discussion topic has made me realize that as a library staff, and more specifically, the supervisor of bibliographic instruction and Head of Reference, I really need to look more closely at assessing or evaluating our services. Currently, our assessment of each is very informal; almost non-existent.


I was thrilled to read this example of powerful reflection in online discussion!

One service I bring up often, and which I have already brought up in this class this week, but which I think might prove to be a valuable tool for you, is Survey Monkey. I pay $19.99 a month to be able to implement an unlimited number of online surveys (though I am charged more if I go over 1000 responses in a month), but teachers can use this for free within the limit of 10 questions and up to 100 responses per survey, which should be plenty for many educators' needs. For instance, you could easily do a monthly (or even weekly) survey of your staff to evaluate your services. Next to a good Google search, an online survey is now one of my first responses when I require data, or simply wonder something.

I am not attempting to sell anything here, just passing on a link to a service that I have found valuable in many contexts - for my research and for my work. :)


More on the CTAP2 iAssessment

A response to a classmate...

I would like to use a bi-annual online district survey (pre- and post-training participation) to get an idea of the effectiveness of training at each grade level and content area.

Evelyn, I too am a fan of the online survey, and use Survey Monkey regularly. In my initial post in this forum, I included a sample course evaluation survey. In California, the state has also provided a tool for collecting and reporting data on teacher and student use of educational technology. You can explore this at http://ctap2.iassessment.org if you are interested.

Though I have found this tool useful, I am afraid that many sites simply complete it because they have to (it is a requirement for many state programs and grants) and then never make use of the data to "determine levels of integration and to identify areas of need" or to use "the results [as] rationale for staff development initiatives for the upcoming years," as you suggest. Naturally, I consider this a waste, and try to encourage people to both use the tool and use the data.


Educational Technology Assessment - Part II

My response to the second prompt of the week in "Management of Technology for Education"...

Congratulations! You have been instrumental in implementing technology in your school. Thanks to your vision, technical expertise, successfully funded proposal, rapport with senior administrators, technology training and staff development efforts, your school has been using technology in almost all subject areas for the past year. It is now time to evaluate your efforts.

a) Describe the methods and procedures you will undertake to evaluate assessment of technology in teaching. How will you go about collecting data... Questionnaires, personal interviews, observations, small group meetings?? How will you determine if technology has made a difference?

b) Provide detailed information on your program assessment plan. Include sample questions you will incorporate in collecting data from participants in your study.

As always, support your comments with research, and also comment on other students' posts.


For the purposes of answering this prompt, I could answer as if I were once again running the educational technology program at a school site; however, I think this will be an even more meaningful exercise for me if I approach it as an opportunity to develop the advice I would give a site technology coordinator who asked how to evaluate their programs.

This topic overlaps quite a bit with the previous one, so I will sometimes refer back to my previous post.

In fact, to begin with, as I stated earlier this week, the evaluation of a program depends heavily on the initial needs assessment for the program. Why was the program implemented? Did it do what it was meant to do? Did it meet the identified need(s)?

An effective evaluation also depends on good evaluation design. Oliver (2000) suggests six formal steps to evaluation design:

"1. Identification of stakeholders
2. Selection and refinement of evaluation question(s), based on the stakeholder analysis
3. Selection of an evaluation methodology
4. Selection of data capture techniques
5. Selection of data analysis techniques
6. Choice of presentation format" (Oliver, 2000)

I also found his articulations of the three elements of evaluating an educational technology to be helpful as an initial focus.

"• A technology
• An activity for which it is used
• The educational outcome of the activity" (Oliver 2000)

This focus and the above suggested steps are good as broad starting points and a formal framework, but there are many "challenges presented by the evaluative contexts... [and] a large number of possible contextual variables, operating independently and interactively at several levels" (Anderson et al, 2000) within a school's educational technology program; these are made all the more complicated by the various implementations that will occur depending on the subject, department, or faculty member (Anderson et al, 2000).

Two methods for addressing these challenges are to form a multidisciplinary research team and to implement a multi-method research design. Anderson et al (2000) describe the benefits of these evaluation methods:

"There were two fundamental (and familiar) aspects of our approach to evaluation which we felt – both in prospect and in retrospect – put us in a good position to tackle the general challenges outlined above. The first was to have a multi-disciplinary research team, whose members would bring to the investigation not only knowledge about educational technology, evaluation, and learning and teaching in higher education, but also sets of research skills and approaches that were distinctive as well as complementary. The second broad strategy was to have a multi-method research design, which involved capitalising on documentary, statistical and bibliographic materials already in the public domain, reviewing records held and reports produced by the projects themselves, as well as devising our own survey questionnaires and interview protocols in order to elicit new information." (Anderson et al, 2000)

The authors also suggest a variety of slightly more specific evaluation strategies:

  • "tapping into a range of sources of information
  • gaining different perspectives on innovation
  • tailoring enquiry to match vantage points
  • securing representative ranges of opinion
  • coping with changes over time
  • setting developments in context
  • dealing with audience requirements" (Anderson et al, 2000)

I would specifically recommend some techniques that have worked for me:

    1. Observations - Walk around the campus. How are computers being used in the classroom? How are they being used in the library or computer labs? What assignments are students completing with their computers, and what products are students creating? How are these things different from the way students were learning and creating in the past? Much can be learned (and many a dose of reality swallowed) when an educational technology coordinator gets into the field on a random day to see how the rubber meets the road.

    2. Focus groups and/or informal interviews - A survey or other more formal evaluation instrument can be severely biased if the evaluator simply asks about the things he or she thinks need to be asked. Conducting observations can go a long way toward helping an evaluator understand what needs to be formally evaluated, but not only can he or she observe only a limited subset of the program's entire implementation, but the issue of altering the observation by the mere presence of the observer comes into play as well. By including others (ideally a multi-disciplinary group, see above), the evaluator can gain valuable insight and create a more effective survey in turn. The following are examples of questions from a focus group discussion I led during a district technology leaders meeting at which we had representatives from each district present.

    In what ways do districts currently use OCDE Educational Technology training facilities and trainers?

    In what ways could our training facilities and trainers be used to better meet district technology staff development needs?

    Specifically, what classes would you as district technology leaders like to see offered in the county training labs?

    Specifically, what classes would you as district technology leaders like to see offered through our "custom training" program?

    In what ways can our training programs best effect positive change in student learning?

    3. Conduct surveys - Online surveys in particular are now easy to administer (particularly when learners are sitting in front of computers anyway) and to analyze (most services will do simple analysis for you)... though their design should not be approached any less carefully. I use Survey Monkey for an evaluation following each professional development event that I manage. There is no reason a site tech coordinator or classroom teacher couldn't do the same thing. Samples of the questions we use at the OCDE can be found at this sample survey.
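Most services will indeed do this analysis for you, but the arithmetic behind a Likert-scale summary is simple enough to reproduce by hand. The following is a minimal sketch; the question wording and export format are hypothetical, not tied to Survey Monkey or any other service:

```python
def summarize_likert(responses):
    """Average 1-5 Likert ratings per question across all respondents.

    `responses` is a list of dicts mapping question text to a rating;
    questions a respondent skipped are simply absent from their dict.
    """
    totals = {}
    for response in responses:
        for question, rating in response.items():
            totals.setdefault(question, []).append(rating)
    return {question: round(sum(ratings) / len(ratings), 2)
            for question, ratings in totals.items()}

# Hypothetical ratings exported from a post-workshop evaluation:
responses = [
    {"The pacing was appropriate": 4, "I can apply this in my classroom": 5},
    {"The pacing was appropriate": 3, "I can apply this in my classroom": 4},
    {"The pacing was appropriate": 5},  # this respondent skipped the second question
]
print(summarize_likert(responses))
# → {'The pacing was appropriate': 4.0, 'I can apply this in my classroom': 4.5}
```

The averages are only a starting point, of course; the open-ended responses still need a human reader.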

    4. Focus groups again - A single-minded and easily biased analysis of the data by the evaluator is not nearly as valuable as organizing, or re-convening, a multi-disciplinary team to review the results.

    In California, any technology coordinator can also use the CTAP2 iAssessment survey for teachers (and for students) to evaluate technology use on their campus.

    Of course, when it comes to the effectiveness of any instructional program, one can look to state test scores as well, though the evaluator may be more concerned with learning outcomes other than what a standardized state test measures. In this respect, it might be best to develop an evaluation strategy for tracking authentic outcomes. For this to be effective, it might require continued communication with students for years after they leave a school.

    Finally, Shaw and Corazzi (2000) share a set of nine "typical purposes of evaluation" that might be kept in mind when designing an evaluation process specific to a given school site.
    "1. measurement of achievement of objectives of a programme as a whole
    2. judging the effectiveness of course or materials
    3. finding out what the inputs into a programme were – number of staff, number and content of contact hours, time spent by the learner, and so on
    4. ‘mapping’ the perceptions of different participants – learners, tutors, trainers, managers, etc
    5. exploring the comparative effectiveness of different ways of providing the same service
    6. finding out any unintended effects of a programme, whether on learner, clients or open learning staff
    7. regular feedback on progress towards meeting programme goals
    8. finding out the kinds of help learners need at different stages
    9. exploring the factors which appear to affect the outcomes of a programme or service" (Thorpe, 1993, p. 7, as cited in Shaw and Corazzi, 2000)

    Again, I suspect this may be too lengthy to be terribly valuable to others in the class, but it has been a valuable exercise for me.



    Anderson, C., Day, K., Haywood, J., Land, R., and Macleod, H. (2000). Mapping the territory: issues in evaluating large-scale learning technology initiatives. Educational Technology & Society. 3(4). Available http://ifets.ieee.org/periodical/vol_4_2000/anderson.html

    Oliver, M. (2000). An introduction to the evaluation of technology. Educational Technology & Society. 3(4). Available http://ifets.ieee.org/periodical/vol_4_2000/intro.html

    Shaw, M. and Corazzi, S. (2000). Avoiding holes in holistic evaluation. Educational Technology & Society. 3(4). Available http://ifets.ieee.org/periodical/vol_4_2000/shaw.html

    Educational Technology Assessment - Part I

    This post is a bit too "research based" for my tastes, but as I summarized for my classmates, it was a valuable exercise...

    Teaching Assessment: Describe briefly how you assess your teaching performance in the classroom (or any instruction you give as part of your job). Are you satisfied with this method? What are some of the advantages/disadvantages of the method(s) you currently use?

    I am currently teaching professional development courses in educational technology at the Orange County Department of Education, and have just completed the process of assessing and planning the summer schedule, so I will consider this process from start to finish in answering this prompt.

    In my formal and informal assessments, I make an effort to use both qualitative and quantitative assessments. Oliver (2000) supports this philosophy.

    "On the one hand, quantitative methods claim to be objective and to support generalisable conclusions. On the other, qualitative methods lay claim to flexibility, sensitivity and meaningful conclusions about specific problems. Quantitative evaluators challenged their colleagues on the ground of reliability, sample validity and subjectivity, whilst qualitative practitioners responded in kind with challenges concerning relevance, reductionism and the neglect of alternative world views." (Oliver, 2000)

    In reading his paper I also discovered that a "new philosophy has emerged" which seems to mirror my own fierce focus on pragmatism.

    "A new philosophy has emerged that eschews firm commitments to any one paradigm in favour of a focus on pragmatism. Rather than having a theoretical underpinning of its own, it involves a more post-modern view that acknowledges that different underpinnings exist, and adopts each when required by the context and audience." (Oliver, 2000)

    Though it may not be appropriate for the work we will do in academia for Walden, the philosophy Oliver elaborates validates many of my decision making priorities as a practitioner.

    "Central to this view is the idea of evaluation as a means to an end, rather than an end in itself. Methodological concerns about validity, reliability and so on are considered secondary to whether or not the process helps people to do things. Patton provides various examples of real evaluations that have been perfectly executed, are well documented, but have sat unread on shelves once completed. In contrast, he also illustrates how “quick and dirty” informal methods have provided people with the information they need to take crucial decisions that affect the future of major social programmes." (Oliver 2000)

    Most importantly, he describes such a pragmatic practice as requiring "the creation of a culture of reflective practice similar to that implied by action research, and has led to research into strategies for efficiently communicating and building knowledge" (Torres, Preskill, & Piontek, 1996, as cited in Oliver, 2000).

    In implementing this philosophy I begin with what Scanlon et al (2000) would consider the "context" of the evaluation. In order to evaluate the use of educational technology "we need to know about its aims and the context of its use" (Scanlon et al, 2000). Ash (2000) also suggests that "evaluation must be situation and context aware."

    In order to understand the context of my evaluations, I first performed a broad needs assessment via focus groups (such as the quarterly district technology leaders meeting) and survey (using surveymonkey.com and a listserv) to set the goals for the professional development schedule. I use course descriptions developed in partnership with the instructors and others in my department to further determine the goals of individual courses. Finally, on the day of a course (and sometimes before the first day via email) I always ask the participants to introduce themselves, explain where they work, and what they hope to get out of the class. This helps me to tailor that specific session to the individuals in the room. (I also ask all of the other instructors to do the same.)

    During a class I monitor what Scanlon et al (2000) might call "interactions," because "observing students and obtaining process data helps us to understand why and how some element works in addition to whether it works or not." I often check for understanding, and always include "interactive modes of instruction" (NSBA, n.d.).

    Due to my initial and ongoing assessments, following a course I am able to focus on what Scanlon et al (2000) might call the "outcomes" of a course. "Being able to attribute learning outcomes" to my course can be "very difficult... [so] it is important to try to assess both cognitive and affective learning outcomes e.g. changes in perceptions and attitudes" (Scanlon, 2000). I use formal evaluations, which include both Likert-scale questions and open-ended questions. For some special events, such as the Assistive Technology Institute (which we put on for the first time this spring), I follow up the initial evaluation of the session with an additional online survey a week later. The real test of my success, though, is an authentic one... it is whether or not the teachers and administrators return to their sites and apply what they have learned. A dramatic example of this sort of authentic evaluation came following the blogging for teachers classes I ran over the past two months. After the first few weeks, it was clear that teachers were not using their blogs (for I had subscribed to all of them using my feed reader). Bringing this up in subsequent training sessions led to productive discussions of the barriers, and eventually (primarily because we followed up, I believe) they began using them; I am now often greeted with new posts when I return to my reader.
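This kind of follow-up (subscribing to every teacher's blog and watching for new posts) is exactly the check a feed reader automates. As a rough illustration of what the reader does under the hood, here is a hedged sketch using only Python's standard library; it assumes the blogs publish ordinary RSS 2.0 feeds, and the feed content below is invented:

```python
import xml.etree.ElementTree as ET
from datetime import datetime

def posts_since(rss_xml, cutoff):
    """Count the <item> entries in an RSS 2.0 feed published after `cutoff`."""
    root = ET.fromstring(rss_xml)
    count = 0
    for item in root.iter("item"):
        pub = item.findtext("pubDate")  # RFC 822 style, e.g. "Mon, 23 May 2005 08:00:00 GMT"
        # Drop the trailing zone name so strptime only handles the date and time.
        published = datetime.strptime(pub.rsplit(" ", 1)[0], "%a, %d %b %Y %H:%M:%S")
        if published > cutoff:
            count += 1
    return count

# An invented teacher-blog feed; in practice each feed would be fetched over HTTP.
feed = """<rss version="2.0"><channel><title>Room 12 Blog</title>
<item><title>Week one</title><pubDate>Mon, 02 May 2005 08:00:00 GMT</pubDate></item>
<item><title>Field trip</title><pubDate>Mon, 23 May 2005 08:00:00 GMT</pubDate></item>
</channel></rss>"""

print(posts_since(feed, datetime(2005, 5, 15)))  # → 1
```

A coordinator could run such a check across a list of feed URLs and follow up with any blog showing zero recent posts, which is essentially the intervention described above.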

    Ultimately, "good assessment enhances instruction" (McMillan, 2000), and I believe that such authentic assessments are the only way for me to know the true impact of my programs. I hope to be able to include more such authentic follow-up assessments in the coming months.

    Because the county programs operate largely on a cost recovery model, by which districts pay for services rendered, cost is also a factor in my assessment of the professional development programs I manage. “An organisation is cost-effective if its outputs are relevant to the needs and demands of the clients and cost less than the outputs of other institutions that meet these criteria” (Rumble, 1997, as cited in Ash, 2000). To determine the cost effectiveness of a program...

    "evaluators need to:
  • listen and be aware of these aspects and others;
  • focus the evaluation towards the needs of the stakeholders involved; and
  • continue this process of communication and discussion, possibly refocusing and adapting to change, throughout the study (what Patton refers to as “active-reactive-adaptive” evaluators)." (Ash, 2000)

    Unfortunately, "the area is made complex by a number of issues that remain open for debate" (Oliver, 2000).
    "These include:
    • The meaning of efficiency. (Should it be measured in terms of educational improvement, or cost per day per participant, for example?)
    • The identification of hidden costs. (Insurance, travel costs, etc.)
    • The relationship between costs and budgets. (Are costs lowered, or simply met by a different group of stakeholders, such as the students?)
    • Intangible costs and benefits. (Including issues of quality, innovation and expertise gained.)
    • Opportunity costs. (What alternatives could have been implemented? Moreover, if it is problematic costing real scenarios, can costs be identified for hypothetical scenarios and be used meaningfully as the basis for comparison?)
    • The use of ‘hours’ as currency. (Are hours all worth the same amount? If salary is used to cost time, how much is student time worth?)
    • Whether something as complex as educational innovation can be meaningfully compared on the basis of single figures at the bottom of a balance sheet." (Oliver & Conole, 1998b, as cited in Oliver, 2000)

    I work in a strange hybrid of a business and a public institution, which further complicates this issue: sometimes it is not best to be cost effective, as long as a service is valuable or, for political reasons, is perceived as valuable.
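Rumble's definition above at least makes the arithmetic concrete: whatever the politics, cost per participant-hour gives two delivery models a common denominator. A small sketch with invented figures:

```python
def cost_per_participant_hour(total_cost, participants, hours):
    """Unit cost of a training offering: dollars per participant per hour of instruction."""
    return total_cost / (participants * hours)

# Invented figures comparing two ways of delivering the same six-hour workshop:
onsite = cost_per_participant_hour(total_cost=4800, participants=20, hours=6)
county_lab = cost_per_participant_hour(total_cost=3600, participants=30, hours=6)

print(f"on-site: ${onsite:.2f} per participant-hour")      # → on-site: $40.00 per participant-hour
print(f"county lab: ${county_lab:.2f} per participant-hour")
```

Of course, as Oliver's list of open issues makes clear, a single bottom-line figure like this hides intangible costs and benefits; it is a starting point for comparison, not a verdict.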

    This has been a valuable reflection for me. I hope the large blockquotes did not make it too difficult to read, and I look forward to any of your comments.



    Ash, C. (2000). Towards a New Cost-Aware Evaluation Framework. Educational Technology & Society. 3(4). Available http://ifets.ieee.org/periodical/vol_4_2000/ash.html

    McMillan, J. H. (2000). Fundamental assessment principles for teachers and school administrators. Practical Assessment, Research & Evaluation. 7(8). Available http://PAREonline.net/getvn.asp?v=7&n=8

    NSBA. (n.d.). Authentic learning. Education Leadership Toolkit: Change and Technology in America's Schools. Retrieved May 20, 2005 from http://www.nsba.org/sbot/toolkit/

    Oliver, M. (2000). An introduction to the evaluation of technology. Educational Technology & Society. 3(4). Available http://ifets.ieee.org/periodical/vol_4_2000/intro.html

    Scanlon, A. J., Barnard, J., Thompson, J., and Calder, J. (2000). Evaluating information and communication technologies for learning. Educational Technology & Society. 3(4). Available http://ifets.ieee.org/periodical/vol_4_2000/scanlon.html

    Sunday, May 15, 2005

    2005 Education Arcade Conference

    I attended the "Spark Session" from 5 to 7 this evening at the LA Convention Center. The preparations for E3 are already looking spectacularly impressive, and I can't believe the variety of amazing educational technologists I met in less than two hours at the EducationArcade.org pre-conference opening event... Researchers, developers, consultants, etc. Within ten minutes of walking into a room full of people I didn't know, I had enjoyed two conversations about MMORPGs in education! (After talking to only 3 people!) By the end of the night I'd run out of business cards and come home with a pocket full of contacts - and potential Delphi panel members. It is an amazing thing to be (from time to time) surrounded by like-minded people who are experts in the things you are interested in.

    I hope I find the time to blog more about this conference than I did about the CUE conference in March or about my residency in April. But, for tonight, I am tired and I need to get to work early to set up for the district technology leaders meeting at the OCDE. Right now I can't believe I need to be there, and I can't wait to blow out of there at 11:30 to get back to the most important event related to my research for the next year! At least my colleagues from the N-MUSD are ditching the meeting to be at the conference and will record the AM sessions (with iPod and iTalk) for me. ;)

    Even if I don't get to blog as the week progresses, I will be able to post my final for the quarter in two weeks, and this will include material from the conference. I start my KAMs in three weeks... and my dissertation is right around the corner!



    Consider the Affective Filter in Professional Development

    Sometimes the fear of a computer (or of damaging a computer, or of simply looking stupid when using a computer) can be enough to ruin the experience of an educator in a technology professional development session. Tonight I was reminded of the affective filter concept...

    I agree with you that collaboration should not be forced. Many times learners are placed in situations where they are not comfortable. The lack of comfort is not for the material being taught, but the social setting of the collaboration. It is vital to ensure that the needs of the learner are taken into consideration.


    You are right. When collaboration is forced, this can serve to raise the affective filter for the learner.

    I found a reasonable definition of "affective filter" at this site.

    ""Affective Filter" is the term Stephen Krashen has used to refer to the complex of negative emotional and motivational factors that may interfere with the reception and processing of comprehensible input. Such factors include: anxiety, self-consciousness, boredom, annoyance, alienation, and so forth."

    The site also includes several practical suggestions for instructors.

    "We maintain low affective filters in the following ways:

  • We do not test students on the material they are working with. This eliminates a major source of anxiety. The only testing in the program is for placement purposes... whatever anxiety this generates is associated with the infrequent placement procedure, not with the daily classroom environment.
  • We do not require students to perform when they are not ready and willing to do so. Speaking is always voluntary and always welcome; hence, it is genuine speaking, in contrast to the embarrassed, strained output that passes for speaking in some methods. We never make our students feel awkward or self-conscious by putting them on the spot.
  • We use authentic materials -- feature movies, newspapers and magazines, popular fiction, etc. -- rather than ESL textbooks and the like. Boredom is less likely with these materials, since they are the kinds of things normal people enjoy in real life.
  • We do not use exercises, drills, or any kind of artificial task that has no ostensible or sensible purpose other than language practice. Instead, we maintain a flow of ordinary, meaningful language about people, places, things, ideas, stories, and so on. Such activities do not become annoying; they are universally accepted as normal, basic modes of human interaction.
  • Teachers function as partners and mentors (positive roles) but not as testers and judges (negative roles). All testing and placement is done at the program level, not by the individual teachers. This helps prevent feelings of alienation and hostility toward teachers.
  • Frequent placement testing... enables us to keep students in groups that reflect their current needs and abilities. Since all of the students in a class have similar skill profiles, they function well as a community. This helps maintain positive attitudes and good will among the class members."

    These suggestions may also be valuable to our efforts in educational technology professional development.


    Needs Assessment for Professional Development

    Tonight I dug up some material on needs assessment from a previous quarter...


    These classifications of needs assessments are very familiar, but I haven't been able to sort out where I've seen them before... probably here at Walden.

    I did manage to rediscover two other systems I had been exposed to here at Walden, though.

    Smaldino, Russell, Heinich, and Molenda (2005) advocate analyzing learners as the first step of their ASSURE model of instructional design.

    "Several factors... are critical for making good methods and media decisions:

  • General Characteristics
  • Specific entry competencies
  • Learning Styles

    General characteristics include broad identifying descriptors such as age, grade level, job, or position, and cultural or socioeconomic factors. Specific entry competencies refer to knowledge and skills that learners either possess or lack: prerequisite skills, target skills, and attitudes. The third factor, learning style, refers to the spectrum of psychological traits that affect how we perceive and respond to different stimuli, such as anxiety, aptitude, visual or auditory preference, motivation, and so on." (p. 49)

    Before even reaching this step in their analysis, Morrison, Ross, and Kemp (2004) suggest that the instructional designer first needs to identify the problem in order to determine whether or not instruction should indeed be part of the solution (p. 31). As part of this needs assessment process, they consider normative needs, comparative needs, felt needs, expressed needs, anticipated or future needs, and critical incident needs.

    "A normative need is identified by comparing the target audience against a national standard." (p. 32)

    "Comparative needs are similar to normative needs... a comparative need, however, is identified by comparing the target group to a peer that is another company or school as opposed to a norm." (p. 33)

    "A felt need is a desire or want that an individual has to improve either his or her performance or that of the target audience. Felt needs express a gap between current performance or skill level and desired performance or skill level." (p. 34)

    An expressed need is "a felt need turned into action." (p. 34)

    "Anticipated needs are a means of identifying changes that will occur in the future. Identifying such needs should be part of any planned change so training can be designed prior to implementation of the change." (p. 35)

    "Critical incident needs [are] failures that are rare but have significant consequences - for instance, chemical spills, nuclear accidents, medical treatment errors, and natural disasters such as earthquakes, hurricanes, and tornados." (p. 35)

    I know this is a lot of copied text, but I hope these needs assessment concepts will add something to our discussion of professional development planning.



    Morrison, G. R., Ross, S. M., and Kemp, J. E. (2004). Designing Effective Instruction. (4th Ed.) Hoboken, NJ: John Wiley & Sons, Inc.

    Smaldino, S.E., Russell, J. D., Heinich, R., Molenda, M. (2005). Instructional Technology and Media for Learning. (8th Ed.) Upper Saddle River, NJ: Pearson Merrill Prentice Hall.

    Saturday, May 14, 2005


    I've been meaning to write about this since the residency in April, so I am glad one of my classmates brought it up...

    By providing the opportunity for hands on, teachers can try it for themselves. This speaks to "Triability."


    I have encountered this term frequently of late. At the residency in Tampa last month, Dr. Ches Jones spoke about Diffusion Theory and the adoption of innovation. He was concerned with public health issues, such as the use of child safety seats (car seats), but the theories he discussed are directly applicable to our work as professional developers in the field of educational technology.

    Referring to the work of Rogers (1995), Dr. Jones spoke about the five elements needed for the successful diffusion of an innovation:

    "1. Relative advantage – the degree to which an innovation is perceived as better than the idea it supersedes.  Like the mobile telephone, the greater the perceived relative advantage of an innovation, the more rapid its rate of adoption will be.

    2. Compatibility – the degree to which an innovation is perceived as being consistent with the existing values, past experiences, and needs of potential adopters.  An idea that is incompatible with the ideas and norms of a social system will not be adopted as rapidly as an innovation that is compatible.  For example, the use of contraceptive methods in Moslem and Catholic countries where religious beliefs discourage use of family planning is an incompatible innovation.

    3. Complexity – the  degree to which an innovation is perceived as difficult to understand and use.  New ideas that are simple to understand are adopted more rapidly than innovations that require the adopter to develop new skills and understanding.   

    4. Trialability – the degree to which an innovation may be experimented with on a limited basis.  New ideas that can be tried in smaller stages will generally be adopted more quickly than innovations that are not divisible.  People are more inclined to bite off a pilot of an idea or try a new product if it does not require a long-term investment or commitment.

    5. Observability – the degree to which the results of an innovation are visible to others.  The easier it is for individuals to see the results of an innovation, the more likely they are to adopt it.  Solar collectors are often found in neighborhood clusters in California, with three or four located in the same block.  Other consumer innovations like home computers are relatively less observable, and thus diffuse more slowly." (Cognatek Group)

    With respect to the adoption of educational technologies by teachers, it became immediately clear to me that we suffer in many of these areas...

    The medium- or long-term relative advantage is difficult for teachers to see when, in the short term, adoption is a clear relative disadvantage due to the need to learn new (and potentially difficult) things. New technologies are not only commonly incompatible with earlier technologies, but also with earlier teaching paradigms. Clearly, innovations in educational technology are often perceived by many teachers to be difficult to understand and use, whether or not this is actually true. Trialability is often not available for teachers, particularly due to the lack of professional development time to learn and experiment with the innovations... and due to the lack of a "test" class to learn with; any trials a classroom teacher does are done in what IT folks would call their "production environment," their real students. (No one in IT would ever consider installing a new server OS on their real servers before trying it elsewhere first!) Finally, in many cases, we definitely do not have observability... if there is a stellar technology-using educator down the hall from, or even next door to, a very traditional educator, the use of innovative technologies can go completely unnoticed by the traditionalist... and even in the break room, it is not as though technology-using educators walk around with a badge or sticker (like "I voted") announcing to the world that they are using something innovative in their classrooms. If our professional development efforts in educational technology are to be successful, we will need to address each of these issues head on.

    Also, I was unable to relocate the specific post this evening, but on one of the blogs I read in my daily RSS feeds (Joystiq, if memory serves, though a search there turned up nothing), I read about a theory that people are becoming desensitized to traditional advertising campaigns - and even to the sort of subtle embedded advertising that happens in TV, movies, and now video games. The article suggested that trialability will become an increasingly important advertising and marketing strategy as consumers become increasingly convinced that getting their hands on and using a new product is the best way to judge its worth. I see this in educational technology as well. I increasingly refuse to purchase hardware or software for a school or district (or county office) without first being able to try a fully functioning version of it, probably for a lengthy period of time, perhaps a month or more.


    PS. I really wish the word were "triability"... this is much more elegant.


    Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: The Free Press.

    Cognatek Group. (2004). Module 7: KM as a Business Strategy. KM Concepts. Available http://www.metainnovation.com/researchcenter/courses/kmconcepts/KM_Concepts_Module_6_files/km_concepts_module_6.htm

    Schools that Learn Revisited

    A response to a classmate I thought I might share here as well...

    The fifth discipline is an excellent resource for building great organizations. Therefore, my staff development philosophy comes from Senge’s fifth discipline and that is to see patterns rather than events. I would focus on making changes and educating staff on topics that will affect patterns within the organization.


    I have enjoyed Senge's The Fifth Discipline both times I've studied it... once in an educational leadership course in my masters program, and once here at Walden.

    While at a Residency in Seattle last year, I visited a used book store and managed to pick up The Fifth Discipline Field Book: Strategies and Tools for Building a Learning Organization by Senge, Kleiner, Roberts, Ross, and Smith. This book is intended as a guide for "companies, businesses, schools, agencies, and even communities" (as the jacket advertises), and - designed for browsing - it includes a variety of practical material such as solo and team exercises, innovations in organizational design, theory and methods, and "stories that incorporate systems archetypes or other applications of systems thinking" (p. 9). Though schools are mentioned in this book, the focus is far more broad, as was the focus of the original book.

    Some time later, I ordered Schools that Learn: A Fifth Discipline Fieldbook for Educators, Parents, and Everyone Who Cares About Education by Senge, Cambron-McCabe, Lucas, Smith, Dutton, and Kleiner. The scenarios in this volume focus on schools of all levels. It begins with a primer of the five disciplines, and so would be effective as a stand alone volume if anyone is interested. The book is then organized into sections that focus on the classroom, the school, and the community.

    In EDUC-6310, I prepared a presentation of this book, and in case anyone is interested, I have attached that work to this post. I'll warn you that it is unnecessarily lengthy for the most part, but it might help you get the gist of the book and decide whether or not it is worth adding to your own library.


    My presentation of Schools that Learn.

    Using Technology to Support Professional Learning Communities

    Written for class, this is one of the many intersections between my classwork and my work at the OCDE...

    My training or staff development philosophy is based on the idea that teachers are members of a professional learning community...


    Your discussion of professional learning communities caught my attention. This is a philosophy that has gathered a good deal of momentum in Orange County, and the Department of Education has done a lot of work supporting the development of PLCs in Orange County Schools.

    In January these efforts were focused on the two-day workshop Student Success... Whatever It Takes, at which my colleague in the Educational Technology Department, Jason Ediger, presented his philosophy of using technology to support professional learning communities. Jason has since moved on to begin work for Apple Computer as the manager for iPod in education, so I have been lucky enough to take over the delivery of his PLC presentation, particularly in the AB 75 and Private School Principal's Academy programs I manage.

    The presentation, based on the work of Richard and Becky DuFour, focuses on "laying the foundation" (of shared mission, values, and goals), "developing high-performing teams" (by facilitating communication and the sharing of resources), and "developing a results oriented culture." The discussion of specific technologies that can be used to facilitate these things includes project management systems, assessment applications, note-taking (and voice recording) software, RSS feeds and readers, blogging, mind-mapping software, videoconferencing, and online resources.

    The big "A-HA!" at the end of the presentation is that the skills learned and refined by members of a PLC are the same skills we desire to impart to our students in the 21st century.

    "PLC members will...

    - Access, manage, integrate, and evaluate information
    - Construct new knowledge
    - Communicate with others in order to participate effectively in society"


    Thursday, May 12, 2005

    A professional development philosophy...

    Written for class, of course... but it also relates directly to the subtitle of this blog!

    Put yourself in a leadership role. If you were in charge of planning staff development and/or training for your place of employment ......

    I happen to actually be the coordinator of professional development in educational technology for the Orange County Department of Education. However, most of the time I am still not in a leadership role with respect to an organization, since individuals or schools seeking specific training are our most common customers. That said, I am sometimes asked to work as more of a partner with a district or site to help them determine a long-term plan for their educational technology professional development. I will approach this assignment as if this scenario were the case.

    1. What would be your training or staff development philosophy?

    In short, I subscribe to a constructivist philosophy of teaching and learning (though I concede that some behaviorist teaching and learning is sometimes called for, I think for the most part we would benefit from a much higher ratio of constructivist teaching, especially once the learners, adult or otherwise, have learned to read, write, and compute... so that they are then able to read, write, and compute to learn). About a year ago, I began trying to really break down my brand of constructivism into its constituent parts. After toying with the phrase "project-based learning" and realizing the final product was not a critical component of powerful learning, I realized I was really after context-embedded, inquiry-driven, and socially negotiated learning.

    In simpler terms, I now focus on the elements of context, inquiry, and collaboration for most courses or lessons I design.

    In order to support a learning environment of context, inquiry, and collaboration within an organization, I have embraced the philosophy of Professional Learning Communities, which the county office is currently supporting.

    It is also worth noting that I find the idea of teaching to adult learners to be (mostly) meaningless, in that when I read about the concerns and the accommodations that are necessary, it simply sounds like "good teaching" to me, principles that should be applied to any learner. All learners have lives outside the learning environment, all learners come to you with the baggage of their past experiences, all learners need to see the relevance of what they are learning, and all learners need to learn in a way that capitalizes on their learning style, interests, and aptitude.

    2. Explain why you decided to choose this philosophy for your organization.

    These precepts are well supported by research. In fact, the ideas of context, inquiry, and collaboration all appear in this week's reading.

    Context is the least represented, but with respect to the organizational context in which professional development must take place, Butler (2001) suggested, in bold caps, that "STAFF DEVELOPMENT SHOULD BE BASED ON THE EXPRESSED NEEDS OF TEACHERS REVEALED AS PART OF THE PROCESS OF COLLABORATIVE PLANNING AND COLLEGIAL RELATIONSHIPS." She also shared that "Griffin (1982) identifies a number of organizational context issues that might affect the design of staff development and change efforts [including]... the school's history of change, and the importance of the leadership's ability to analyze the characteristics of the setting and school." In the case of professional development for educators, what they are learning must be embedded squarely into what they are trying to accomplish in a classroom (and the more this can be accomplished through simulations or role playing during a development session the better).

    A big part of achieving this - of embedding learning in a meaningful context - is helping the learners see the relevance of their learning to their own work and interests. So, it is also important that "teachers identify and collect data in an area of interest, analyze and interpret the data, and apply their findings to their own practice" (Butler, 2001). And, because "the INQUIRY approach will become more widely used as the teacher-as-learner/teacher-as-reflective-practitioner paradigm takes hold" (Butler, 2001), it is important to help foster this paradigm shift. One way to encourage such reflection is to encourage more collaboration among teachers.

    Butler (2001) shared the definition of collaboration as "focused interchange with fellow teachers to give and receive ideas and assistance." She went on to suggest that "staff development is most influential where it ensures collaboration adequate to produce shared understanding, shared investment, thoughtful development, and the fair, rigorous test of selected ideas; and where it requires collective participation in training AND implementation." However, this is not something that can be mandated; Butler goes on to warn that "induced collaboration" can carry "high costs in time spent on adjusting to working together and in risk of being exposed to new kinds of criticism and conflict in small groups" and that "forced collegiality doesn't work." Still, a collaborative atmosphere for learning can be encouraged, and "a collaborative culture that must be facilitated and supported by leadership so that informal collegiality supports the formal collaborations required in staff development programs." Ultimately, the leader must understand and support "the concepts of collaboration and norms of collegiality."

    The concept of a professional learning community (PLC) embodies these principles of learning well. For brevity's sake, below is a succinct definition of a PLC offered by the North Central Regional Educational Laboratory.

    "The term professional learning community describes a collegial group of administrators and school staff who are united in their commitment to student learning. They share a vision, work and learn collaboratively, visit and review other classrooms, and participate in decision making (Hord, 1997b). The benefits to the staff and students include a reduced isolation of teachers, better informed and committed teachers, and academic gains for students. Hord (1997b) notes, 'As an organizational arrangement, the professional learning community is seen as a powerful staff-development approach and a potent strategy for school change and improvement.'"

    3. Given that staff development time is limited, what would you set as a priority for staff development?

    As with any organization-wide implementation, a needs assessment is absolutely critical, and I would set this as my first priority. (I should note that Walden's Dr. Howard Shechter, who works in knowledge management, has a different perspective on this... his clients usually come to him knowing what they need, and he often skips this step quite successfully... and I have found that this is often the way things work in "real life" - the needs assessment is an integral part of daily work and has logically already moved from the data gathering stage into the analysis stage by the time someone realizes there is a problem and that professional development is needed.) I would also make the focus on context, inquiry, and collaboration a high priority. (In fact, I am doing this right now for all the professional development programs I am managing.) I would prioritize the process of a PLC only insofar as it helped to achieve these primary goals.

    Now, if the learners have not yet learned to read, write, and compute, then certain preliminary proficiencies may need to be made first priority. (This would be only for those learners - not for everyone!) Some behaviorist training may be most effective in achieving these preliminary goals.

    4. Describe how you would approach the delivery of the training or staff development.

    The model that is available at the county can be scaled down to a district quite effectively. Optional classes should be available for those interested in furthering their own growth. For development required for a department or site, custom training solutions should be created (that are delivered on site and on the systems the learners use, in an effort to embed the learning in the context of their work). Both the optional classes and the custom trainings should adhere to the philosophies of context, inquiry, and collaboration. Certification should be available (through optional classes and custom trainings) for teachers who need to master their preliminary proficiencies. Finally, online education should also be available, particularly for the optional classes, in order to ensure that the largest number of people could have access to the growth opportunities they desire.



    Butler, J. A. (2001). Staff development. NW Regional Educational Laboratory. Available http://www.nwrel.org/scpd/sirs/6/cu12.html

    North Central Regional Educational Laboratory. (2004). Professional learning community. Available http://www.ncrel.org/sdrs/areas/issues/content/currclum/cu3lk22.htm

    Tuesday, May 10, 2005

    Blogs: The Great Conversation 2.0

    It's late, but I need to tell these stories...

    On March 9th I posted this post about MMORPGs in Education, in which I cited Clark Aldrich's Simulations and the Future of Learning. About a month later he must have stumbled across it, and he left this comment clarifying his position. I received an email alerting me it had been posted and was shocked to see a message from Clark Aldrich in my inbox. When I read the post, I learned of his blog, to which I have now subscribed. (He also plugged his new book, which I've now ordered.)

    Then, on Friday (three days ago) this post showed up in my aggregator (thanks to an MSN search feed I wrote that simply scans for "Mark Wagner" and "Educational Technology"). It turns out that when writing last week, I had cited a paper Susan Mernit wrote in 1995. Within 24 hours of my post, she had seen what I wrote and responded to my writing on her own blog! By Sunday night, her writing (and encouragement) had come to my attention in my aggregator.

    As a student and teacher of literature and philosophy I have often talked about "the great conversation" between authors of different ages. Unfortunately this conversation has often been only one way, since the response to an initial work often comes after the initial author can no longer reply - or at least can no longer publish another work in reply. With blogs, though, this can happen over a matter of months, days, or even minutes. I am honored to be a part of the conversation... and no less so because I've yet to be "published" in the traditional sense.

    To be sure I'm connecting the dots for you here... students can have this same experience, as Will Richardson's students have, and as Sheri Bithell's students have, just to name a few examples.

    BTW, I really enjoyed sharing these stories with the cohort of principals I was training today in OUSD. And they didn't even make fun of my hat. :)



    While I was busy writing a dry critique of a Moodle site, one of my classmates wrote about the interactivity of Amazon. I merely followed his lead with these thoughts here...

    Some, but not all, of the dynamic features of the Amazon.com website are:
    1) It tracks visitor activities in real time and reacts to those activities by creating buying recommendations based on the recorded activities.
    2) The site greets each registered visitor by name and customizes the welcome page, recommendations page, and other features throughout the site based on user preferences.
    3) The site allows for direct user input to tweak its recommendations and the like.


    At the Orange County Department of Education, we have talked a lot recently about what it would take to implement a learning portal modeled after the success of amazon.com!

    I am sure this is what you were getting at, but imagine an interactive site that includes features such as these:

    1) It tracks student activities in real time and reacts to those activities by creating learning recommendations based on the recorded activities.
    2) The site greets each student by name and customizes the welcome page, recommendations page, and other features throughout the site based on student preferences.
    3) The site allows for direct student input to tweak its recommendations and the like.

    That was just plain fun to type!

    We have talked about a page that would show the student (or staff member visiting our site for professional development purposes) all of the upcoming face to face classes, custom trainings, videoconferences, webcasts, and online courses that relate to their interests (such as content areas, standards, etc.). It would of course also include links to archives of any past events... and to courses, etc., that they are participating in at the time.

    As I explore the possibilities of a future full of interactive games and simulations as learning environments, I can also imagine a portal that includes links to the latest games and simulations related to student interests. Imagine... Others who have played "The American Revolutionary War" might also enjoy playing "The French Revolution", "Cinco de Mayo", or "The Gaza Strip"... or something like that. ;)
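    The "others who have played X might also enjoy Y" idea is, at heart, item-to-item co-occurrence recommendation - the simplest version of what Amazon.com does with purchases. A minimal sketch in Python (the game titles and student play histories here are invented for illustration):

```python
from collections import Counter

# Hypothetical play histories: student -> set of games completed
histories = {
    "ana":  {"The American Revolutionary War", "The French Revolution"},
    "ben":  {"The American Revolutionary War", "Cinco de Mayo"},
    "cara": {"The American Revolutionary War", "The French Revolution", "Cinco de Mayo"},
    "dev":  {"The French Revolution"},
}

def recommend(game, histories, n=2):
    """Suggest the games most often played by students who also played `game`."""
    co = Counter()
    for played in histories.values():
        if game in played:
            co.update(played - {game})  # count co-played titles, excluding the game itself
    return [title for title, _ in co.most_common(n)]

print(recommend("The American Revolutionary War", histories))
```

    A real portal would weight the counts by recency or ratings and persist them between sessions, but the core lookup is just this counting step.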

    This would very much capitalize on your ideas...

    Imagine a web tutor that could track student habits and adjust the lessons based on their habits—capitalizing on strengths and also addressing weaknesses (detected by user input or activities which are avoided.) The model provided by Amazon.com suggests that such a site is in the realm of possibility.

    I also appreciated Lauretta's suggestion that "They often don't remember from one class to the next what they had been working on, so having the computer 'remember' would be very helpful!" Students could always pick up where they left off... or even return to a previous "save point" to pick up the momentum of the lesson, game, or simulation.
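    A sketch of the "remember where they left off" idea: a per-student store of save points that a lesson, game, or simulation can write to and later resume (or rewind) from. The class and method names are my own invention for illustration, not any particular product's API:

```python
class SavePoints:
    """Remembers each student's progress through a lesson, game, or simulation."""

    def __init__(self):
        self._points = {}  # student -> ordered list of (step, state) save points

    def save(self, student, step, state):
        self._points.setdefault(student, []).append((step, state))

    def resume(self, student):
        """Return the most recent save point, or None for a brand-new student."""
        points = self._points.get(student)
        return points[-1] if points else None

    def rewind(self, student, step):
        """Return an earlier save point, to pick up the momentum of a lesson."""
        for saved in reversed(self._points.get(student, [])):
            if saved[0] == step:
                return saved
        return None

progress = SavePoints()
progress.save("jo", step=1, state={"score": 10})
progress.save("jo", step=2, state={"score": 25})
print(progress.resume("jo"))  # most recent save point: (2, {'score': 25})
```

    In practice the store would live in a database keyed to student accounts, but the interface - save, resume, rewind - is the whole idea.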

    I'm glad you decided to review a seemingly unlikely site.


    Learning to Compute, and Computing to Learn

    My reaction to this simple statement by a classmate led me to crystallize a thought I should have stumbled on a long time ago...

    Making the assignment personal and asking students to critique and/or apply new learning is a good start and can inhibit plagiarism while improving deep learning (Snyder, 2005).

    Thank you for this reference, Mia. This is a philosophy I feel very strongly about.

    For the most part, I feel that if a student is able to plagiarize and still fulfill an assignment, then the assignment is not worth assigning in the 21st century.

    This is in contrast to some of the skill or memorization based mastery learning philosophies of the past, but is reasonable given the nearly universal access to information available online. Why should a student write a paper on the causes of WWII if they can access this information any time? Perhaps the causes should only be written about in terms of their relationship to, or application to, problems of today.

    Based on my previous post today, I suppose the major caveat here is that this philosophy may have its limitations when it comes to students who are still "learning to compute" (or "learning to read" - and write - for that matter) and who do not have the motivation for this kind of generative learning.


    Education as an Import or Export

    Here is another of my responses to a colleague at Walden...

    To the best of my knowledge, the state of Florida is also experimenting with virtual high schools. The online classes are offered for students who can't attend classes for various reasons (health, incarceration, and so on) or to offer access to subjects not offered at the local high school (Latin for example).


    This is not only true, but Florida is exporting their virtual high school courses! The Newport-Mesa Unified School District in California began an online program for high school students last year, and though they have plans to transfer to their own system sometime in the future, for their initial efforts they simply purchased access to Florida's online curriculum. They continue to do so as they expand the program in its second year.

    Incidentally, it strikes me that as people are concerned about the United States outsourcing science and engineering jobs (in addition to other labor), and as entertainment continues to grow as our second greatest export, perhaps education, too, could become an important export for our country. Currently, an increasing number of students from all over the world come to the United States for their face-to-face education and then return to their home countries to use their new skills. Walden has students from all over the world participating in distance learning. While this tendency is alarming to some, I think perhaps looking at education as an export might be a more positive (and global) way to see this, and it might even be a productive direction to focus our energies.


    No new innovations?

    Much of what I have written this past week is probably not worth sharing here, but perhaps this is. A classmate of mine, a recently retired school district IT director, actually wrote the sentence below. My reaction follows.

    There are no new innovations I would like to see emerge.

    This was shocking to read, but I think I see where you are coming from. I presume you don't mean that you don't want to see any new innovations, only that there are no particular innovations you are looking forward to. I wonder if I am right about this.

    I know what you mean when you say that the technology industry is always rolling out innovations, yet educational institutions have no constructs in their processes to handle change as quickly as the industry demands. It is difficult to watch school districts launch exciting new programs, only to see the programs fail because previous technologies had not been implemented or maintained properly.

    For instance, with the handheld grant I managed, the handhelds made one-to-one computing a reality at two middle schools (a situation with many obvious advantages, not the least of which was the ability for every student to be able to word process anytime, anywhere), yet the program was plagued by technical frustrations because in order to use the devices to their full potential (and to print out student writing for teachers to grade - since teachers were not comfortable with grading on a handheld), the handhelds had to synchronize with the school's desktop computers. Unfortunately the school networks and computers were already sorely neglected, and without getting into all the details, it will suffice to say that synchronizing was always a major issue (and printing was even worse).

    I did work to upgrade the computers and network, which helped, but this generated professional development needs and a host of other compatibility issues. (At the time, we moved from Mac OS 9 to the entirely new Mac OS X.) And as much as I have enjoyed Apple's new "release early and often" philosophy personally, it is frustrating to see that all that work is already out of date again.

    Still, even though I understand the system wide ramifications of adopting new innovations, especially on a large scale, I would rather see our organizations adapt so that they can more readily adopt beneficial technologies than to see them resist new innovations. To do this, we will have to part with some of our traditional (and competing) ideas about school.

    In this respect I agree with some of your other points - such as that the pressure is on teachers to catch up with students in their comfort level with technology, and to meet industry demand for quality graduates with the ability to handle the technology... and such as your final sentence:

    The innovation, which would be the most benefit to education, is to solve the equity and access issues, address the total cost of ownership that comes along with such applications, and examination and renovation of the testing requirements for measuring student achievement.

    I would add that when we address these issues we do so with the goal of establishing a plan for systematic renewal, such that unforeseen innovations can be easily and efficiently integrated into teaching and learning where appropriate.


    Thursday, May 05, 2005

    Podcasting... and the new role of face to face instruction

    My reply to a professor's response to my post last night...

    Thanks for your response, Dr. Hazari. It's exciting to hear that you're exploring podcasting!

    Lectures can be podcast and archives made available to students not only from professors but also guest lecturers. This actively forces students to use the technology not only for specific disciplines such as Music but in any field.

    I agree, of course. And imagine how much more powerful this becomes when students are podcasting their work, or their reactions to class sessions (since I'll presume the cutting edge faculty won't be delivering lectures). Elements of good PBL and good design could be integrated as well, as students work independently or in groups to record, edit, and produce quality podcasts. When I first got into podcasting, I recorded and posted a few episodes on my blog as well. It is a time consuming process to create something that truly takes advantage of the audio medium (as opposed to simply reading what I've written), but making decisions about what media to use - and how to best use various media - to get their message across will help students develop critical 21st century skills.

    On the other hand, a pessimist would argue that it gives one more reason for students to stay away from classes knowing material (in audio or video format) is available for perusal later :(

    My feeling is that if this is the case, then classes need to improve or else go unattended.

    I hope we do not limit students to outdated modes of learning simply because of our own biases or associations. I do not believe there is any inherent educational value to "going to class" (or not). I have learned, and not learned, in both ways. My brother, an actor and philosopher with whom I have great conversations about educational technology, uses web based training videos to learn things such as how to use new software and web programming skills, but he is very concerned with using face to face time to do things that can only be done with the high-bandwidth of face to face communication and with bodies physically in the same space. (Not surprisingly, he also believes that audiences should participate in performance art.)

    I no longer think that a face to face meeting is at all appropriate for mere information transmission - I am now offended when asked to come to a meeting or class to listen to someone (or several people) speak. I expect the attendees to be tapped or asked to contribute in some creative way - and if I am not, then I expect the courtesy of an email (or other electronic communication) that I can consume, reflect on, and respond to at my own convenience.

    I know this is primarily my opinion, but there you have it folks. Your students may not be able to articulate their feelings as well, but many of them feel this way. We as educators must adapt.

    Incidentally, Walden, too, must adapt. The residencies are currently in a time of transition, and a time of finding a new identity. It is no longer appropriate to bring students together to simply listen to faculty. Over the past year Dr. Brigham and others have worked to make the residency experience more interactive... and to take advantage of the resources available when they bring hundreds of PhD students from all over the world together in the same space! I am excited by many of the new ideas that were discussed at the Tampa residency and look forward to seeing what Walden residencies become when information transmission is handled online and people are able to create together face to face.

    Of course, as video conferencing becomes increasingly affordable, the need for face to face classes to include physical and bodily components will increase. This will be interesting...


    iPod and Life

    I figured if I was going to email this to Jason Ediger, I might as well just post it here for everyone.

    I commuted to Oakland today (for the conference planning committee meeting of the Computer Using Educators), and I've been doing a lot of flying these past two years. It was not so long ago that I remember flight attendants struggling to decide whether to say "walkmen" or "CD players" when instructing passengers to turn off all electronic equipment before takeoff... but apparently this problem is settled now. The attendants on both my a.m. and p.m. flights said "all electronic devices including iPods."


    PS. No cause for alarm... Blogger's spell check still doesn't recognize the word iPod. It would prefer "wiped"! Then again, it prefers "welcoming" to "walkman" and "blockers" to "blogger".

    Innovations, Past and Future

    Written for class, of course, but also one of my favorite class posts in a while:

    It probably won't surprise you that the following is the question I'm most interested in addressing...

    What innovations in the Internet or technology would you like to see emerge and how will they fit into education?

    I'd like to begin by reflecting on some of the web sites from this week's reading which predicted innovations related to those that have recently captured my interest.

    Riel (1995) wrote:

    "Current developments in communication technology now provide new options for students to extend themselves across distances and through time. This technology invites children to leap off the "shoulders of giants" onto satellites and use this global perspective to participate in new ways with their peers and other experts in distant locations. It is possible that these experiences will help make the power of the written word more apparent to a new generation of citizens."

    The text sounds prescient, yet she writes mostly about email in her examples, and though she does mention message boards, it is not surprising that she failed to imagine the power of blogs and the read/write web to transform student learning. She does write of "a complex partnership among the student, parents, teachers, schools, communities, districts, states, regions, and nations," but could she possibly have imagined something such as this parent child blog (including video conferencing with the author) when she wrote her article? And although she writes that "computer and printer technology made it possible for people other than parents and teachers to easily read the products of young writers," the mention of the printer seems anachronistic now, and again I can't imagine she knew how true this would be given the millions of youth blogging (or journaling) online ten years later. Still, she understood that student writing could contribute to the community and that conversely, the community would come into the classroom - something that might not always be good. Did she imagine that cities like Fullerton in Orange County would embark on projects that would provide city wide wireless internet access, which would in turn allow high school students unfiltered access to the internet while on school grounds (and using school-owned wireless computers)?

    Hunter and Richards (1995) also sounded prescient, with their talk of "distributive collaborative inquiry." They certainly were in a way, but when they wrote of "Each school contribut[ing] to a collective database that includes information from many different geographic locations" they did not foresee the relative ease of student publishing in 2005, when schools and students can publish their own work online with the click of a submit button. This makes truly distributive knowledge possible, and collaboration is further facilitated by the comment features of most read/write web applications - and the use of Wikis in education allows truly collaborative authoring and editing.

    Mernit (1995), too, mentioned publishing, but this was at a time when she was amazed to announce that there were 1300 educational websites available; a Google search today for the phrase "educational web site" turns up about 286,000,000 hits! At the time Mernit was writing, "Only one-fifth of one percent (0.2 percent) of the approximately 100,000 K-12 schools in the United States [had] enough network access to develop their own Web sites" - now such access is nearly universal. (California, for instance, has 73% of its schools not only connected to the internet, but to a high speed broadband network.) Mernit's projections about where WWW publishing was going in 1995 seem spot on, in spirit, especially the suggestion that "the focus on multimedia and interactivity will increase" - even if she did not specifically foresee the read/write web that students have access to today.

    It's wonderful that the truth of 2005 has turned out to be so much more amazing than what these writers could ever have predicted in 1995, the year Netscape was publicly released.

    As for what I would like to see emerge, I've never been asked an easier question at Walden. With text based blogs already graduating to visual and audio content (consider flickr.net and ipodder.org respectively), and with vodcasting (video on demand casting) already here, what I look forward to is students creating more and better multimedia content to contribute to the community through their own blogs, podcasts, and vodcasts. As students acquire skills such as those taught in the ACME animation project and being developed at the Education Arcade, they will be well suited to contribute to the entertainment industry (the United States' second biggest export industry). In education, I envision a profusion of student made games (since, as Professor Seymour Papert, 1980, says, if students are to play video games, they should create them), created using an open source gamemaster's kit not unlike the one included with Neverwinter Nights, which is already being used at MIT to develop Revolution. I envision advanced students creating more of the simulated online worlds, similar to the way the 'jedi' in the MUDs of the early nineties would write levels for other players. This will be project and problem based learning for the designers, and context-embedded, inquiry-driven, and socially negotiated learning for the players of the multiplayer online role playing games.

    I could write all night about this, but will get this posted before midnight instead.

    Note, though, that my interest in making projections about the future, and in affecting the future, is even leading me to explore the Delphi method for use in my dissertation.



    Hunter, B., & Richards, J. (1995). Learner contributions to knowledge community and learning. The Future of Networking Technologies for Learning. U.S. Department of Education. Available http://www.ed.gov/Technology/Futures/hunter.html#RTFToC6

    Imperial County Office of Education. (2005). Program FAQ. K-12 High Speed Network. Available http://www.k12hsn.org/about/faq.php

    Mernit, S. (1995). Publishing on the WWW: what's happening today and what may happen in the future. The Future of Networking Technologies for Learning. U.S. Department of Education. Available http://www.ed.gov/Technology/Futures/mernit.html

    Papert, S. (1980). Mindstorms: children, computers, and powerful ideas. New York: Basic Books, a Division of Harper Collins Publishers Inc.

    Riel, M. (1995). The Internet and the humanities: the human side of networking. The Future of Networking Technologies for Learning. U.S. Department of Education. Available http://www.ed.gov/Technology/Futures/riel.html