Tuesday, January 26, 2010

Brainstorm

If the question we ask is: What criteria do we use to measure student learning?

Students can learn in various ways; we just have to figure out how to measure that learning.

Town Hall Meeting
  • New technology. Not going to be much of a change: same process, but electronic. Versus: achieving the benefits of going online carries a certain cost. Standardize the criteria. Here's the person in charge; here's his contact info.

Problems to Solve

The Open Learning Network. What is the origin of learning networks?

Things we've learned so far:
  • The definition of "tutor" matters
  • Mastery learning = next-best thing? Critiques, control groups, tests
  • Constructivist environment may be what is working in classes
  • Frequency of feedback
  • Design-based research
  • Situated cognition
  • "Tutoring" happens via tests
  • Human potential
Blackboard Analysis project
Variables of learning technology? Keep Blackboard? Keep something else?

Blackboard used for communication, not for teaching.

I always have to struggle with the principles before I get to the answers to the questions people are really asking. I don't see any problem with some people using Blackboard and some classes not using it. I realize it's expensive, but no cost should be too great for quality education, even if that means everyone uses a different medium. There is no one right answer.

Now, knowing that, and guessing that everybody else knows that too, how to maximize use of Blackboard?

Are the tools we're providing helping the institution accomplish the Aims of a BYU Education?
  • Spiritually Strengthening
  • Academically Enlarging
  • Character Building
  • Lifelong Learning and Service
What kinds of experiences can we have through these tools that can help fulfill the aims?

With the Vocal Performance Database, how can I know that it will help do all of the above things? Is it the best tool to do the things we need it to do?

*Pretend I am the decision-maker, and I'm putting into place a tool set for education. What kinds of activities would it facilitate? Is it a tool, or is it driving common practice? A Teaching and Learning Tool for BYU.

The implementation of a new technology: what do I need to be prepared to explain? What problem are we trying to solve by using this tool?

Tuesday, January 19, 2010

Class Notes

Haiti complains that help isn't arriving fast enough. This is the same complaint of the 2-Sigma Problem. Students could be performing so much better if we were to fix the system, but we're not.

Google Docs - Research papers collected from other classmates

John Chapman- 2-Sigma solved by Corbett! Well, okay, just 1.7. Repeatability of studies is a concern; no one can repeat the study.

Jennifer Berry- Why does the 2-Sigma Problem happen in the first place?

Anneke Majors- It may be the actual interaction, the constructive process, that is the key to re-creating the 2-Sigma effect.

D&C 88:118- Teach one another.

Anne Makin- Group learning may close the gap. Problem-based learning allows students to learn together and teach one another.

Michael Bush- Origin of the word "lecture": from the Latin lectura, "a reading." Teachers read; students wrote and copied. The fact that we're still doing lectures is depressing. Technology can't solve the 2-Sigma Problem, but we could use it more effectively to close the gap.

Is solving the 2-Sigma Problem our goal? We might also have to entertain the possibility that it's impossible.

Thomas Edison, "We now know a thousand ways not to build a light bulb."

David Wiley- Halo- complexity of controls. More complex than most things students learn in school, and yet they master it.
  • Tutoring = mean feedback events per unit of time. In class, this probably means less than one feedback event per hour. When playing Halo, it's probably 120/minute. The whole thing is about generating feedback events.
  • Time on task. Spending more time on the content = better performance.
How do we create more environments for providing feedback more often during classes?
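The feedback-rate comparison above works out to a striking ratio. A quick back-of-the-envelope sketch using the rough figures from the notes (less than one feedback event per hour in a lecture, roughly 120 per minute in Halo); the numbers are the notes' estimates, not measurements:

```python
# Tutoring operationalized as mean feedback events per unit of time,
# using the rough figures from the class discussion.

def feedback_rate(events: float, minutes: float) -> float:
    """Mean feedback events per minute."""
    return events / minutes

lecture = feedback_rate(1, 60)   # about 1 feedback event per hour in class
halo = feedback_rate(120, 1)     # about 120 feedback events per minute in Halo

print(f"Lecture: {lecture:.4f} events/min")
print(f"Halo:    {halo:.0f} events/min")
print(f"Ratio:   {halo / lecture:.0f}x")  # Halo delivers ~7200x more feedback
```

Even if the estimates are off by an order of magnitude, the gap is enormous, which is the point of the Halo example.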

What is feedback, spiritually?

Are students vegging out, or do they have what they need to interact with the material being presented? Interaction: you initiate some kind of communication, and the environment acts back. When you say no texting, no passing notes, stop talking, you're cutting off all the interaction that allows students to get feedback.

Spray and pray. iClickers can change the dynamic by making lectures interactive.

Reading notes

Cognitive Computer Tutors: Solving the Two-Sigma Problem, Albert Corbett
http://www.springerlink.com/content/h6wvchfgg0u9pdjl/fulltext.pdf
  • Solving the 2-Sigma Problem with mechanized individual feedback and mastery learning in the context of a cognitive model of problem solving.
  • Cognitive tutors have closed the gap and maybe even surpass human tutors.
  • Cognitive tutors employ a cognitive model of student knowledge for individualized feedback and mastery learning (e.g., in the ACT Programming Tutor).
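The notes say Corbett's tutors drive mastery learning from a cognitive model of student knowledge. A minimal sketch of Bayesian knowledge tracing, the standard formulation behind these tutors; the parameter values below are illustrative, not taken from the paper:

```python
# Bayesian knowledge tracing: after each answer, update the probability
# that the student knows the skill, and keep giving practice until the
# mastery criterion is met. Parameter values are made up for illustration.

P_INIT = 0.2    # p(L0): prior probability the skill is already known
P_LEARN = 0.15  # p(T): chance of learning the skill on each practice step
P_SLIP = 0.1    # p(S): chance of a wrong answer despite knowing the skill
P_GUESS = 0.2   # p(G): chance of a right answer without knowing the skill
MASTERY = 0.95  # mastery-learning criterion on p(L)

def update(p_known: float, correct: bool) -> float:
    """Bayesian update of p(L) from one observed answer, plus a learning step."""
    if correct:
        evidence = p_known * (1 - P_SLIP)
        posterior = evidence / (evidence + (1 - p_known) * P_GUESS)
    else:
        evidence = p_known * P_SLIP
        posterior = evidence / (evidence + (1 - p_known) * (1 - P_GUESS))
    return posterior + (1 - posterior) * P_LEARN

# Mastery learning loop: practice continues until the model is confident.
p = P_INIT
steps = 0
while p < MASTERY:
    p = update(p, correct=True)  # pretend the student keeps answering correctly
    steps += 1
print(f"Mastery after {steps} correct answers (p = {p:.3f})")
```

The individualized feedback and the "when to stop practicing" decision both fall out of the same model, which is what makes the tutor's mastery learning automatic rather than teacher-managed.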

Rethinking Mastery Learning Reconsidered, T.R. Guskey, 1987.
http://www.jstor.org/stable/1170237?cookieSet=1
  • Pairing educational interventions may lead us to better results.

Tuesday, January 12, 2010

IP&T 682- The 2 Sigma Problem

Bloom, B. (1984). "The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring." Educational Researcher, 13(6), 4-16.

Mott, J. and Wiley, D. (2009). "Open for Learning: The CMS and the Open Learning Network." in education, 15(2).
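For context on Bloom's title: the "2 sigma" is an effect size, meaning the average tutored student scored about two standard deviations above the average conventionally taught student. A sketch with invented scores; only the sigma-units formula reflects Bloom's framing:

```python
# Effect size in sigma units: difference in group means divided by the
# control (classroom) group's standard deviation. Scores are hypothetical.

from statistics import mean, stdev

classroom = [62, 70, 75, 80, 68, 73, 77, 65, 71, 79]  # conventional instruction
tutored   = [80, 88, 84, 90, 82, 86, 89, 81, 87, 83]  # one-to-one tutoring

sigma = (mean(tutored) - mean(classroom)) / stdev(classroom)
print(f"Effect size: {sigma:.2f} sigma")  # about 2 sigma with these numbers
```

An effect of that size would move a 50th-percentile student to roughly the 98th percentile, which is why Bloom treats it as both a benchmark and a challenge.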

Thoughts from the readings:
  • Changing one variable changes performance. We're not tapping into potential.
  • Teacher attitude - teachers not confident in student abilities. Excellent teachers take personal responsibility for the success of their students. Is excellence the goal?
  • Reinventing the past... why no progress?
  • Regression to what's easy/practical - creativity is more time-consuming, but most powerful. BECOME.
  • Engaging learning- "Hard fun." The human organism is a learning machine. Something happens between Kindergarten and 4th grade that makes learning "hard," "painful," "arduous," or "frustrating," as if there were no intrinsic reward for learning.
  • Econ 110- "Satisficing." Utility maximization: you will do as much as possible to get as much utility as possible, but you will also conserve your resources so you can produce more.
  • Prisoners of Time- You can hold time constant and let learning vary (K-12), or hold learning constant and let time vary (Montessori).
  • decliningbydegrees.org
  • Juniors and Seniors doing better in Freshman-level courses than Freshmen do: enculturating oneself into the BYU way of learning, learning how to survive "the system." Very uneven success strategies for that class.
  • The Five-Minute University
* Find at least two articles that reference the Bloom 2 Sigma piece.

Guskey, T. R. (1987). Rethinking Mastery Learning Reconsidered. Review of Educational Research, 57(2), 225-229. Retrieved January 13, 2010, from http://www.jstor.org/stable/1170237.

http://www.springerlink.com/content/h6wvchfgg0u9pdjl/fulltext.pdf



Thursday, January 7, 2010

And All of the Children are Above Average

Criterion-Referenced Test- A test interpreted in terms of the specific performance demonstrated.

Norm-Referenced Test- A test interpreted in terms of the relative position held in some known group.

Formative- Informs where you're going to go with the teaching. Assessment at the beginning of class could help you be a better teacher.

Elementary and Secondary Education Act (1965)
  • Expanded the use of standardized testing
  • Accountability expectations not met

Tuesday, January 5, 2010

Oppression... I mean Assessment.

Welcome to IP&T 652: Assessing Learning Outcomes. I have a hard time with this class already. To me, assessment = standardization = unfair judgments = I'm not going to measure up in this class. (Wait- I suppose that's an unfair assessment in itself! Such recursion.) Why do we even teach this stuff? Why can't we learn about stuff that is good and that works instead?

Measurement
- objective observation
"The process of assigning numbers that estimate the degree to which a person or object possesses some specified property, attribute, or characteristic."
  • Estimating a degree - "Let's see how you measure up."
  • IQ Test- have to know the scale to interpret the data.
  • Assumption- if they get something right, they would also get the easier items right.
  • NOIR- Nominal (categorical; more of a label/description), Ordinal (ranking), Interval (equal units, no true zero), Ratio (true zero) data
  • Measurement is NOT classification (no nominal data). So for hair, the measurement is the color "Level 12," not brown, blond, or red.
  • Objects have characteristics, but it is the properties of the object that are measured. So we don't measure Sara; we measure how tall Sara is.
  • Sometimes it is intelligence being measured; sometimes it is those who measure.
  • The measurement process assumes that the property being measured exists in varying degrees or amounts (issues of state vs. trait). For example, happiness is a STATE: it fluctuates, varies by degree, and depends on circumstances. TRAITS are more PERMANENT.
  • Measure vs. Categorize
  • Measurement procedures are often indirect
  • Measurement is a human activity
  • Measurement ALWAYS contains some error.
  • http://www.akdn.org/india_environmental.asp - Women in India not attending schools b/c of lack of private bathrooms
  • Quote by Joseph Smith- all things are Spirit but vary by degrees. The Lord measures us: sin, trust in Him, knowledge... Maybe there is more to measurement that could even help me better understand the Atonement.
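The NOIR levels above (Stevens' scales of measurement) determine which statistical operations are meaningful at each level. A small sketch with invented example data:

```python
# Each NOIR level supports the operations of the levels below it, plus more.
# Example values are invented for illustration.

from statistics import mode, median, mean

hair_color = ["brown", "blond", "red", "brown"]  # Nominal: labels only
class_rank = [1, 2, 3, 4]                        # Ordinal: order, not distance
temps_f    = [68, 72, 75, 71]                    # Interval: equal units, no true zero
heights_in = [62, 74, 66, 70]                    # Ratio: true zero point

print(mode(hair_color))               # Nominal supports only counting/mode
print(median(class_rank))             # Ordinal adds median and percentiles
print(mean(temps_f))                  # Interval adds means and differences
print(heights_in[1] / heights_in[0])  # Ratio adds meaningful ratios ("X times as tall")
```

This is why the notes say measurement excludes nominal data: "brown" supports classification but none of the arithmetic that makes a number a measurement.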
Reasons to Measure:
  1. Compare
    "Spencer is 6'2". Aaron is 5'2"." -- Knowing both measurements, people can tell that Aaron is a full foot shorter than Spencer.
  2. Communicate
    "Spencer is 6'2"." -- People can understand about how tall Spencer is.

Assessment - Turning measurement data into usable information about a person
"A systematic process of gathering measurement data and turning it into usable information for a specific purpose."
  • What should a 3rd-grade student learn? At what level should they perform? (This would require the evaluation component of assessment.)
  • Assessment is missing a component that considers the goodness of the data; it can't really determine whether a performance was good or bad.
  • Assessment involves measurement.
  • Tests are a particular type of assessment.
  • A comprehensive assessment involves a set of planned measurements designed to provide evidence or information about a specific person or thing.
  • Assessment is as factual and as objective as it can be.

Evaluation - More thorough investigation of something
"The process of making a value judgment about the person or object based on some set of assessment data."
  • Involves judgment of merit and worth.
  • By nature somewhat subjective.
  • Usually based on a set of criteria (what we value).
  • For consistency, evaluation criteria should be clearly defined and generally agreed upon. (We try for that, at least.)
  • In graduate school, what should an A represent? Some degree of excellence? No truly shared view of what these really are.
  • The Lord evaluates us and takes into consideration opportunity and circumstance.
Next time: Policy Trends and Assessment