So, we were to discuss “best practices” in education this last week. Here is the list I compiled in class, to start.
Differentiated assessment
Differentiated instruction
Teaching higher-order thinking skills
Using technology
Inclusion
Diversity
Smaller class sizes, etc.
Workshop methods (which is the “in” thing for language arts instruction, currently)
Student-centered learning (as opposed to teacher-focused instruction)
Cooperation
Real life problems and practical applications
Informal assessments
Technology
Technology
Technology
Basically, “best practices” are (pardon my saying it) those that are in vogue. They will vary over time and population and region. Currently, Willamette is highly interested in educating its “teachers” to be sensitive to the needs of a diverse population and to be willing and eager to implement technological activities in the traditional classroom. We (the students of this program) are encouraged to think that it is not only ideal but necessary to use technology whenever possible. This line of thinking has evolved for a number of reasons, I am sure. One is that students seem to like technology. Twenty years ago, I was delighted to play Oregon Trail and Number Munchers. A mere two decades later, we want to use “smart boards” instead of chalkboards or whiteboards, and we have finally stopped telling kids that they need to learn cursive because they will use it in college.
But what does that mean in terms of “best practices”? Is it because we think that students learn better when they are able to use technology to access the material? Or is it primarily because we recognize that what we have been teaching has become antiquated, and that we are doing a disservice to students by teaching them things that they will ONLY use as adults if they continue to be students? The five-paragraph essay, for example, is only necessary in the academic world. Knowing “point of view” is probably only necessary for literary positions. And probability (hah!) is apparently only applicable in gambling and in business and economics (in terms of how a company might use it to allocate its merchandise, or so one of my classmates claims), which means that you could simply learn those calculations when you were trained for that job. If “technology” is to be a “best practice,” then it must not only be useful to students in the long scheme of things; it has to improve their understanding of some of the things that we deem “necessary” to a public education. If our goal in public education is to make sure that students are knowledgeable in a number of areas (math, science, reading and writing, etc.), then technology, as a practice, has to serve those goals. Otherwise, it is time to change the practice or the goal.
“Best practices” in any field (or, in terms of academics, the content areas) are widely contentious, too. I remember in kindergarten that I was taken out of the classroom to do activities in another room by myself while the rest of the class had to learn phonics, using that “Hooked on Phonics” program. Obviously, that program, and others like it, are the sorts of practices that someone decided were best for teaching students how to speak and read and write. Hence “basal readers”: someone thought they were the best way to teach literature and reading. There are still some who stand by them. Yet there are others (and more of them than the former) who think that basal readers are boring and that they stunt the growth of most students. They think that learning phonics and spelling and grammar in isolation (as opposed to within some sort of meaningful context) results in nothing but disassociation and stagnation. What are their perceived “best” practices? Self-selected texts, workshop settings for writing and discussion, journal writing, and a focus on strategies for self-correction and understanding rather than rote memorization. I put myself in that second category, if you desire to know.
I am not supremely knowledgeable about trends in mathematics (in terms of how it is taught). I admit that it looks about the same in junior high school as it does in college, at least in the classes I have taken. That means that even over time, a few ideas continue to remain predominant in the strategies employed to teach math. One is that calculations are important: students must not only be able to do them by hand, but they must show all their work. The second is that there is some “logical” order to learning it; you have to learn one thing before the other. Given all the other changes in the world within a 40-year period (my dad used a slide rule to do calculus; I used an electronic calculator), why do we assume that our methodology and the “order” of learning should stay the same? People know, for instance, that if you speak to babies as if they are idiots (otherwise known as “baby talk”), then when they start speaking, they will often mimic those patterns, as if they are incapable of speaking like any adult human being. Yet if you speak to them as equals, they will often mimic that speech pattern instead. Perhaps they will not be as linguistically aware as an adult (their vocabulary is still developing, as is their competence in various contexts), but at least you can understand what they are saying.
I don’t mean to suggest, of course, that it would be a terrific idea to stop teaching arithmetic, or that we can simply teach whatever we feel like because there is no logical order to teaching math. Instead, I think we need to work with the system to figure out what really “has” to come first. Elementary students can understand some basic ideas of physics, so why not assume that there are other “advanced” ideas we can throw at them and see if they stick? I would like to think, too, that we could stop teaching math as though it were an isolated task. If there are practical, realistic, real-life applications for the things we are teaching students, why not let them know what they are? Why do we assume it is “too much” for them to handle at a specific point in time? I can tell you with certainty that I know very few practical applications for the math I spent my years learning. Adding and subtracting and multiplying, of course, are easily applicable to life. Being able to calculate percentages is important (for financial applications, saving money). But I’ll bring up “probability” yet again (sorry, readers). I spent about seven months scratching my head while attempting to think of a useful (and realistic) application for probability. No, I’m sorry, I have never needed to figure out the probability of choosing a red shirt out of my drawer at random. I have never needed to figure out my odds in a casino. When someone told me that car dealerships and manufacturers use probability to figure out how many cars to send to various markets, I was flabbergasted. Why not teach that? Sure, kids don’t necessarily care about cars yet when they are in middle school, but at least they will know that there is a useful purpose for having to calculate exact numbers for probability.
Because otherwise, especially for those of us who do not have numbers dancing in our heads, probability is more or less just a “vague idea of the odds”—not a precise calculation.
I apologize to the classmate whom I paraphrase above, as I do not believe I ever asked your name while we were in class having this discussion about probability. I hope I understood you correctly, and I still secretly want to ask you to show me some examples of how it works, because I am pretty impressed that probability could apply to something real.
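Since I can’t ask my classmate, here is my own guess at how the car-allocation idea might work, as a minimal sketch in Python. The trick is that expected demand is just a probability calculation: if n shoppers each buy with probability p, the expected number of sales is n × p (the mean of a binomial distribution), and you can split a fixed shipment proportionally to that. Every market name and number below is invented for illustration; I have no idea how any actual manufacturer does this.

```python
# Hypothetical sketch: allocating a fixed shipment of cars across markets
# in proportion to expected demand. All figures are made up.

# For each market: how many interested shoppers we expect this month,
# and the estimated probability that any one of them buys this model.
markets = {
    "Portland": {"shoppers": 1200, "p_buy": 0.05},
    "Salem":    {"shoppers": 400,  "p_buy": 0.08},
    "Eugene":   {"shoppers": 700,  "p_buy": 0.06},
}

shipment = 100  # cars available to distribute this month

# Expected demand per market = n * p (mean of a binomial distribution)
expected = {m: d["shoppers"] * d["p_buy"] for m, d in markets.items()}
total = sum(expected.values())

# Split the shipment proportionally to expected demand.
# (Rounding can leave the total slightly off the shipment size in general.)
allocation = {m: round(shipment * e / total) for m, e in expected.items()}

print(expected)    # expected sales per market
print(allocation)  # cars shipped to each market
```

With these made-up numbers, Portland expects 60 sales, Salem 32, and Eugene 42, so Portland gets the biggest share of the shipment even though Salem’s shoppers are individually more likely to buy. That, I assume, is the kind of calculation my classmate meant: a precise number, not a “vague idea of the odds.”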
Blogging—the new “best practice.”