Tuesday, November 16, 2010

Indexing as a Game - Part 2

Contribution: Make it possible for people to find genealogy.
  • Track indexers, records, and the number of temple ordinances as a rewards cycle; this keeps you motivated to keep going in the game. How can we potentially shorten the rewards cycle? Track the number of temple ordinances done from names indexed, the number of records added to pedigrees, the number of search units, and the number of search hits (how many times a record you've indexed has been looked at by others).
  • Get people socialized quickly --> feeling of success early on, retention, obedience to rules of the world, feeling connected to the game -- do a before and after test to find out if it made a difference in numbers. Give the game a name and a face.
  • How much difference would it make to have a greeting from a stake/ward indexing leader right as soon as you go online?
Get to doing as fast as possible:
  • Rather than having people read instructions, have the administrator give local training to each person (synchronous)
  • Or having a video of the administrator greet the newcomer (asynchronous)
  • Set small goals that are achievable quickly
  • Show progress, see milestones for goals--showing people how
  • different screens/lands/records to do based on level in indexing - QUEST
  • Show how well the person did in comparison with the top indexers - names indexed per hour. Show measurable norms so people don't have to start over--show "people like you index at about this rate: ..."
  • To win games, the social aspect is the most important part. (Bragging rights.) Also measure how many names were done in the ward by their family history people.
  • Built-in functionalities
  • Log on, see where you're at, where the ward's at, where the church is.
  • Show a bar of how many records are left to index.
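If we wanted to prototype the tracking and progress ideas above, a first sketch might look like this. This is purely illustrative Python: the stat names, goals, and numbers are invented, not an actual FamilySearch interface.

```python
from dataclasses import dataclass

@dataclass
class IndexerStats:
    """Hypothetical per-user reward-cycle metrics (field names invented)."""
    names_indexed: int = 0
    ordinances_from_names: int = 0   # temple ordinances done from indexed names
    records_in_pedigrees: int = 0    # records added to pedigrees
    search_hits: int = 0             # times an indexed record was viewed by others

def progress_bar(done: int, total: int, width: int = 20) -> str:
    """Render a text bar showing how many records are done vs. left to index."""
    filled = round(width * done / total) if total else 0
    return "[" + "#" * filled + "-" * (width - filled) + f"] {done}/{total}"

# Log on, see where you're at, where the ward's at, where the church is:
me = IndexerStats(names_indexed=40)
print(progress_bar(me.names_indexed, 100))   # personal goal
print(progress_bar(1_200, 2_000))            # ward goal (invented numbers)
print(progress_bar(480_000, 5_000_000))      # church-wide project (invented)
```

Even a plain text bar like this would answer the "visually, how far off am I?" question at a glance.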
How are we going to represent these things? How could we visually represent them (visually, how far off am I)?


Let's start to map it out.

Designing for the Generation - BYU Forum

Generation "Millennials" are:
  • SPECIAL - treat them as VIPs
  • SHELTERED - harm them not!
  • CONFIDENT - be positive
  • CONVENTIONAL - focus on family
  • TEAM-PLAYING - work together
  • ACHIEVING - national spelling bee champions
Generational cycle:
  1. High - Institutions strong, individual weak
  2. Awakening - Individual strong
  3. Unraveling
  4. Crisis - Challenge and opportunity; Xers will be heroes, and Millennials will build the sense of community and connectedness that will build a new era
"This generation has a rendezvous with destiny."

Friday, November 12, 2010

LDS Instructional Design Community Conference

Salt Lake City, Utah

Questions to ask about your solutions:
  • Does the solution motivate learners to change behavior? Does it "fill a bucket" or truly "light a fire" for learners?
  • Is the solution original and creative?
  • What is the estimated impact of the proposed solution?
Changing cultures to adopt new technologies??
  • Formal/informal mentoring.

Tuesday, November 9, 2010

Question for the Week:

What is the end goal of indexing, and how can that be used to motivate people to do more?

FamilySearch Indexing as a Game

Two things to look at to start with:
  1. Challenge - What you have to do to get to the end goal.
  2. Core Performance - Set of skills you have to master to win the game.
For FamilySearch, there's no save-the-world challenge here (ironic, because family history is so important). This is a lot like the Atonement: we're trying to apply it in our lives when it is really constantly present.
  1. Challenge? How could we make this a challenge?
  2. Core Performance = Filling out the field values. This is probably not going to change unless we think of some really creative interface.
How could we make this a challenge?
  • Measure accuracy - raise a level when you match up with others. Maybe use character-recognizing software?
  • Speech to text? Speech recognition- Read it out loud, computer takes speech to text (creates abilities for new audiences)
  • Human challenge is deciphering the words.
  • Speed in indexing? (accuracy might suffer)
  • Maybe one of the goals is to become an Arbitrator.
  • Have the Arbitrator give a score to someone who indexes correctly--giving feedback to indexers
  • Synchronized
  • Doing it as a family- projecting it on the screen and doing it together
  • Quick helps - zoom in, zoom out, find similar letters on the page, putting in handwriting course alphabets into projects
  • Usually people don't read a manual before DOING the computer game. That's part of the fun--you're DOING something.
  • How do we get them right in to indexing names RIGHT NOW without having to first read instructions/do tutorials?
  • Break it up into one field at a time, make it immediately arbitrated so you get immediate feedback. Don't wait for the whole document to be finished.
  • Sometimes indexers can't finish a batch and they have to leave it and don't get back to it--> negative feedback when it is taken away from them and it's like they've wasted time.
  • What's difficult for programmers is usually better for users. :)
  • Put in user profiles where each user is from, languages they're also interested in indexing
  • Like Amazon: "people who have read this book have also read this book..." :: "people who have indexed these records have also indexed..."
  • Mobile phone game?
  • Rank difficulty of batches, give permission to users as they do more and more indexing.
  • Once an arbitrator, recruit other people to index
  • Arbitrator as a trainer (master/apprentice) to be rated on apprentices' performances
  • Creating a "Guild" aspect to FamilySearch indexing??
  • Allowing people to sign up as a team that can download the SAME batches at the same time and bring people together--> instant arbitration.
  • Making indexing about social cohesion -- screen-sharing, IM, voice capabilities, allow people to talk while they're doing it, online at the same time.
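The per-field instant-arbitration idea above could be prototyped roughly like this. A minimal sketch assuming a simple normalize-and-compare agreement rule; the field names and matching logic are my own illustration, not FamilySearch's actual arbitration method.

```python
def arbitrate_field(value_a: str, value_b: str) -> bool:
    """Two indexers' entries for one field agree if they match
    after normalizing case and surrounding whitespace."""
    return value_a.strip().lower() == value_b.strip().lower()

def field_feedback(record_a: dict, record_b: dict) -> dict:
    """Per-field agreement, available as soon as both entries exist --
    no need to wait for the whole document to be finished."""
    return {f: arbitrate_field(record_a[f], record_b[f])
            for f in record_a.keys() & record_b.keys()}

print(field_feedback(
    {"surname": "Smith", "year": "1850"},
    {"surname": " smith", "year": "1851"},
))  # surname agrees; year would go to an arbitrator
```

With two teammates on the SAME batch, this kind of check could fire after every field, giving the immediate feedback the notes call for.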

Thursday, September 16, 2010

Crowd Activated Innovation

TED Talks Speaker Chris Anderson tells how online video helps spread innovative ideas.

Movie

Games in Education Notes

"The purpose of learning is to expand agency." I believe this is really true. In my life, I have seen how education opens doors to make opportunities available that I never would have dreamed of having. Just the process of getting an education has expanded my perspective of how I CAN use my agency.

Intellectual empathy is an interesting idea to me, and I think it is important. It is so necessary to be able to comprehend and appreciate others' positions without necessarily agreeing with them.

I loved the story shared in class about Joseph Smith: after his search for what to do, the Lord waits for him to come with the question he realizes he can't solve on his own; something he has tried to solve but realized he can't. THEN he comes to the Lord, and he is ready for His guidance.

I also am fascinated by the idea of "pseudo-agency." I think this is what is so intriguing and yet so dangerous about games in education.

Tuesday, April 13, 2010

Other People's Summaries

  1. Mastery Learning
  2. Computer-mediated instruction
  3. Pedagogical Approaches
  4. Personally tutored instruction results in better education than classroom instruction.
  5. Domain knowledge is not a significant predictor of student success.
  6. Economic feasibility - where is the line at which the resources needed to produce such outcomes become infeasible?
  7. How much do consistency and customization have to do with the mentoring experience?
  8. Focus on feedback. If feedback can be automated, then we may be closer.
  9. The impact of tutoring on the tutor and what that tells us about the mentoring process.
  10. Learner motivation. ("satisficing")
  11. Operations are within learner's ZPD
  12. Cycles of performance, assessment, and feedback (Guskey)
  13. Learner self-regulates

Monday, April 12, 2010

IP&T 682 Course Summary

"Make sure you all bring with you a summary of the 3-4 things you would say you now know about the 2-sigma problem (or, more broadly, about how to most effectively help people learn)."

From discussing the 2-Sigma problem, I have come to a few conclusions about how to most effectively help people learn.
  1. Learning requires feedback. Students need feedback--from teachers, parents, peers, and mentors. Only feedback will tell them how to improve from their original efforts. Students need detailed feedback so they can know specifically in which areas they are understanding and which areas need a little more work.
  2. Testing is not the only feedback. The school systems have been trying to use tests to provide students with the feedback they think students need for learning. However, test creation is too flawed to tell whether or not a student actually knows the subject material. Teachers use tests to assess student learning. Mentors are attentive to a student's daily work. As a result, learning management systems that rely on testing as a method of assessing student learning cannot be the only tools given a student in order to learn from a mentoring system. Also, standardized tests do not allow for a mentoring system. They are the anti-mentor. Having standardized tests constrains teachers--and students--to a deadline and a limit for learning.
  3. Technology can provide students with more opportunities to receive and provide feedback. Twitter, Blackboard, Learning Management Systems, Personal Learning Environments, etc. allow students the opportunity to articulate the things they are learning and thinking to a mentor figure for feedback. Digitized articles are more accessible.
  4. The value of higher education is in certification and a community of practice. Students no longer need higher education for bits of information; those are already accessible via the internet. Certification may also be offered via the internet. However, a community of practice is developed as a person works among other people also interested in that academic area. Within this community of practice, students may select their own mentors.

Tuesday, April 6, 2010

Class Notes

Apple Conference Update
  • Abilene Christian University - gave out iPhones and iPods to the entire student body
  • iBYU app?
  • Technology Life Cycle - Ruben Puentedura
BYU: 30,000 students, 1,500 full-time faculty. 1:20

Elementary School: 800 students, 25 full-time faculty. 1:32

InvestTools: 40,000 active students, 100 full-time faculty: 1:400

Open High School : 130 students, 4 teachers: 1:22

1:1 ratio found in informal learning contexts: families, marriages, tutoring, learning by the Spirit...

Sooooo...
  • We need to teach people how to seek out the necessary resources to LEARN.
  • Ideas: Teams, Peer Groups, "crowd sourcing," Teams + Instructor

Part of mentoring is helping students recognize when they're in the zone of proximal development.

ZPD --> Metacognition --> Efficacy/Responsibility

Key Ideas, Commenting on the idea of the 2-Sigma Problem:
  • Performance - Did the student acquire new knowledge, skills, abilities to solve new problems?
  • Context
  • Mastery Learning
  • Zone of Proximal Development (ZPD)
  • Ratios
  • Feedback (frequency)
  • Customized Learning
  • Computer-Aided Instruction
  • Characteristics of tutors/mentors, whether that matters
  • Learning Efficacy/Responsibility
  • Love
How can we make sure each student achieves mastery, instead of a group-mastery approach? --> Using computers to do that. --> Limitations of using computers. --> What can we do, using computer frameworks, to achieve mastery learning?
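To picture the performance-assessment-feedback cycle in code, here is a minimal mastery-learning loop. The 80% threshold, attempt cap, and toy learner are illustrative assumptions, not values from Bloom or Guskey.

```python
from typing import Callable

def mastery_loop(assess: Callable[[int], float],
                 threshold: float = 0.8,
                 max_attempts: int = 10) -> int:
    """Repeat performance -> assessment -> feedback until mastery.
    `assess(attempt)` returns a score in [0, 1]; returns attempts used."""
    for attempt in range(1, max_attempts + 1):
        score = assess(attempt)
        if score >= threshold:   # mastery reached: move on to the next unit
            return attempt
        # corrective feedback would be delivered here before the retry
    return max_attempts

# Toy learner whose score improves by 0.2 per attempt
attempts = mastery_loop(lambda a: 0.2 * a)
print(attempts)  # reaches the 0.8 threshold on the 4th attempt
```

The point of the sketch is the loop shape itself: each student cycles individually until mastery, rather than the whole group moving on together.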

*Between now and next week, come prepared with 3 or 4 sentences that summarize all of these concepts. What are the 3 or 4 things you can say about the most effective way to help people learn? Let's see if they're similar.

Maybe in the future we can turn this into a thought piece about these things.

Tuesday, March 23, 2010

David Wiley as Himself

Reflections of the Environment in Memory
  • The brain uses a function to determine what to forget based on our environment.
  • Patterns of word usage in the New York Times
OpenLearning Initiative
  • Students learn effectively without an instructor
  • Students performed as well or better than students learning from traditional instruction
Open High School and 2-Sigma
  1. Information provision --> Online curriculum
  2. Question/Answer diagnostic help and support --> Data analysis (dashboard), Teacher 1-on-1

Tuesday, March 2, 2010

Research Directions

Overall, for the whole 2-Sigma course: what have we learned, and what is the story it tells? If we were writing a lit review about the 2-Sigma problem, what are some statements that would set up the questions we would ask?
  • Tutors matter/definitions (see class Annotated Bibliography)
  • Class size/ratios
  • Feedback: Frequency, Type, Timing
  • Constructivism/Situated Cognition - Centered on students' interests; making classes more student-centered
  • Small Changes
  • Mastery Learning is possibly the next-best thing
  • Technology enabling tutoring/implementation technology in schools

This list is kind of a list of "truisms."

Help colleges redesign high-enrolling courses. Exploratory grant program in early 2000s. Grant was to improve learning while reducing costs.

Freshman Composition - taught in sections of 23-24 or smaller because of the demands grading places on teacher time. Teachers spent most of their time preparing lectures each week and then coming to class and giving the three lectures each week. As they analyzed how they were actually spending their time, they realized they weren't spending it on what could be most beneficial for the students. So they took grant money to create online lectures so teachers could spend more time going over student writing. Instructors spent 30-40 min. on each of four student papers, giving feedback. Then once a semester, they had a one-on-one conference with each student to talk about their writing. So the new model provided the lectures and gave teachers more opportunities to work one-on-one with each student. English classes were able to eliminate 10-20% of sections offered. As part of the project, they brought in external evaluators who read the papers.

Tutor Roles - Sometimes the best thing a mentor can do is not to say anything at all. Guide - a guide knows when to go in and give direction, and when to let the student muddle.

Instructional Science is a Social Science. In Public Policy, we look at public policy issues/challenges: teenage pregnancy, infant mortality, etc. "What if we change this one variable--will that help?" We don't know, because that one change leads to other changes. We don't always know if what we're doing is making a difference. The Science of Muddling Through. In both learning and public policy, the solution that works today probably won't work tomorrow. The subject matter is complex and dynamic.

Like the process of natural selection: It gives your intelligence too much credit. Trial and error is always better than anything any single person can design. We don't have any way of knowing which technology experiment is going to work best. We have to keep trying and getting feedback on our projects to really progress. Except in Education, we don't actually know what we're evaluating. (Like Miller vs. California, we don't really know how to describe it.)

This university is not ideal, but it is, because it's the best we know how to do. Government standards may keep people from experimenting --> not a lot of innovation.

People who had online class probably did better because of increased time on task.

Franklin Covey- Every 2-3 min. they ask for feedback from the student. (key to making effective trainings) Increased time on task.

* Who are we doing this for? (Higher Ed? Elementary School? Middle School?) Maybe we ought to narrow it to give it more focus.

* Maybe individuals can select a focus for it. If you need to pick a focus, what would it be?

  • Mastery Learning: Anneke
  • Concept Map/How they overlap: Anne
  • Technology enabling/implementation: Jared, Jana
  • Feedback: Jenn
  • Tutors matter/Definitions: John (v. mastery)

* Frame a set of unanswered questions, of gaps in the literature--create a plan to answer these questions and fill the gaps.

Tuesday, February 23, 2010

Higher Education Personified

Who do we serve? (What do they do? Why do they come to higher ed.?)
*This is an important question. How we answer it has great implications.
  • Students (Preparing for a future life they've set for themselves; career success)
  • Employees (Gain income)
  • Society (Up to individuals who make it up)
  • Ourselves/Board (For their own purposes)
What do we provide them that they can't get anywhere else?
  • Higher Ed as a provider of bits --> We will die. We're really, really bad at this.
  • Social Life of Education - to be knowledgeable in the world, you need the social context of the information.
  • Credentials/Degree
  • Mentoring/Apprenticeships
Clayton Christensen - IBM program competing with Harvard's MBA. They hire you to get this "education."

New hires from MBA program have to "unlearn" everything they learned. Many employers say, "We don't care about the content you acquire; we just care if you're better than your peers."

Degrees filter the hiring pool. Even if employers have to re-train you, the degree tells them whether or not you could handle higher education.
  • How do we do that better?
How can we tell we're doing a good job?
  • In many instances, employers don't care what the student's degree is in.
  • Do graduates get jobs?
  • Do students achieve success on those jobs?
  • Student evaluations
  • What helps us the most is not always the most pleasant experience. (So we can't just use student satisfaction.)
  • Work backwards from what we want to do. Talk to employers to find out what makes good graduates. What makes students the most useful.
Education and Ecstasy - There has to be desire to learn, and we have to feed that desire.

Washington State University's Harvesting Gradebook. Peers, teachers, employers--increasing severity with each level of assessment.

Neumont University: Employers set the curriculum, and students come in and graduate as quickly as possible and go to really good jobs. They ONLY get Computer Science degrees there, though.

What is the best way to provide it?
What technique do you use?

What is the best way to organize?
What is our organizational structure?

Class Question: What do we want to say at the end of this semester?
  • Is it important or futile to pursue the 2-Sigma problem?
  • Why hasn't it happened yet?
  • People have given up on using people in educational solutions.
  • If you reduce knowledge to information to bits...
  • Granularized education - efficiency of reading with technology
  • People maybe focusing too much on the small pieces of education instead of the general whole.
  • List of research questions somebody should solve
Twitter up to 50 million tweets a day.

*Create a research agenda for creating a tutor-like environment for education. Where should our attention be placed? What tools are emerging that we could use to give people access to something even better than an individual tutor?


Freshman Program

In my considerations of the ideal freshman program, I decided that people don't go to big schools for a small-school experience. In other words, I think BYU has a really great freshman program, especially through Freshman Academy, which allows students who want to be involved in a smaller group to have an outlet there. BYU should not try to make everyone stay here. It is not for everybody. But those who stay will stay because of their positive freshman experience.

This is my opinion.

Tuesday, February 9, 2010

Freshman Mentoring

When freshmen come to BYU, they will now each get a Peer Mentor.
  • Low-tech.
  • Einstein and chauffeur. "That's so easy, my chauffeur could answer it."
  • Every single Freshman will be involved in this program
  • A mentor will attend one class/week with the freshman
Some things to recommend to help Freshmen be successful:
  • Be clear about goals: Enculturation to higher ed. and BYU, support system
  • Frequency of Feedback
  • Early warning/intervention
  • Profiling
  • Ability grouping
  • Avoiding minimal compliance
  • Matchmaking
  • FAQs
  • Synchronous Tools
  • Experienced concierge
  • Program/interest
  • small world networks

Technology for Mentoring/Collaboration:
Measurement/Evaluation:
  • drop-out rates
  • academic performance data
  • Past/Present
  • Student interviews/surveys
  • What does success look like?
  • Trust Relationship - Elder Eyring
  • Enhanced with crowd-sourced tutor

*Design my own Freshman experience. What would that look like? How would you set it up so you could tap into that learning/human potential of every one of your students?

Tuesday, February 2, 2010

Town Hall Meeting

http://blog.brainhoney.com/

There are some things you cannot completely replicate in a completely online environment. What impact would that have on learning? Do you have to be in a face-to-face environment?

There are some things you cannot completely replicate in a face-to-face environment.

What tools inside the current LMS would a tutor use to effectively guide students through content?
  • Automate corrective feedback

Is face-to-face interaction essential to tutoring element of education? Can we do the tutoring effectively with certain online tools? Spontaneity difficult to achieve online.

"If we don't kill them, they will be successful." It's so difficult to even make it into BYU, they're going to be successful no matter what they choose. How do we know if we've added value?

"Education is largely the process by which a teacher's lecture notes get into students' notebooks."

How do we navigate a world with so much information?

A more fundamental problem: In far too many instances, we don't put the time and effort into defining success. Instead, we create goofy multiple-choice tests that don't have any direct relationship with what we were doing.

Tuesday, January 26, 2010

Brainstorm

If the question we ask is: What criteria do we use to measure student learning?

Students can learn in various ways; we just have to figure out how to measure that learning.

Town Hall Meeting
  • New technology: not gonna be much of a change--same process, but electronic. vs. In order to achieve the benefits of online learning, there's a certain cost associated with it. Standardize the criteria. Here's the person in charge; here's his contact info.

Problems to Solve

The Open learning network. What is the origin of learning, networking?

Things we've learned so far:
  • The definition of "tutor" matters
  • Mastery learning = next-best thing? Critiques, control groups, tests
  • Constructivist environment may be what is working in classes
  • Frequency of feedback
  • Design-based research:
  • Situated cognition
  • "Tutoring" happens via tests
  • Human potential
Blackboard Analysis project
Variables of learning technology? Keep Blackboard? Keep something else?

Blackboard used for communication, not for teaching.

I always have to struggle with the principles before I get to the answers to the questions people are really asking. I don't see any problem with having some classes using Blackboard and some classes not using Blackboard. I realize it's expensive, but no cost should be too great for quality education, even if that means everyone uses a different medium. There is no one right answer.

Now, knowing that, and guessing that everybody else knows that too, how to maximize use of Blackboard?

Are the tools we're providing helping the institution accomplish the Aims of a BYU Education?
  • Spiritually Strengthening
  • Academically Enlarging
  • Character Building
  • Lifelong Learning and Service
What kinds of experiences can we have through these tools that can help fulfill the aims?

With the Vocal Performance Database, how can I know that it will help do all of the above things? Is it the best tool to do the things we need it to do?

*Pretend I am the decision-maker, and I'm putting into place a tool set for education. What kinds of activities would it facilitate? Is it a tool, or is it driving common practice? A Teaching and Learning Tool for BYU.

The implementation of a new technology. What are the things i need to be prepared to explain? What problem are we trying to solve through using this tool?

Tuesday, January 19, 2010

Class Notes

Haiti complains that help isn't arriving fast enough. This is the same complaint of the 2-Sigma Problem. Students could be performing so much better if we were to fix the system, but we're not.

Google Docs - Research papers collected from other classmates

John Chapman- 2-Sigma Solved! by Corbett. Well, okay, just 1.7. Repeatability of studies. No one can repeat the study.

Jennifer Berry - Why does the 2-Sigma Problem happen in the first place?

Anneke Majors- It may be the actual interaction--the constructive process-- that is actually the key to re-creating the 2-Sigma problem.

D&C 88:118- Teach one another.

Anne Makin- Group learning may close the gap. Problem-based learning allows students to learn together and teach one another.

Michael Bush - Origin of the word "lecture": from the Latin lectura, "a reading." Teachers read, students wrote and copied. The fact that we're still doing lectures is depressing. Technology can't solve the 2-Sigma Problem, but we could use it more effectively to close the gap.

Is solving the 2-Sigma problem our goal? We might have to entertain the possibility that it's impossible.

Thomas Edison, "We now know a thousand ways not to build a light bulb."

David Wiley- Halo- complexity of controls. More complex than most things students learn in school, and yet they master it.
  • Tutoring = mean feedback events per unit of time. In class, this probably means less than one feedback event per hour. When playing Halo, it's probably 120/minute. The whole thing is about generating feedback events.
  • Time on task. Spending more time on the content = better performance.
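Putting the two feedback-rate estimates above on the same scale is simple arithmetic; only the "120 per minute" and "less than one per hour" figures come from the notes.

```python
# Feedback events per hour, from the estimates in the notes
classroom_per_hour = 1        # "less than one feedback event per hour"
halo_per_hour = 120 * 60      # ~120 feedback events per minute, for an hour

print(halo_per_hour)                          # Halo's hourly feedback rate
print(halo_per_hour // classroom_per_hour)    # roughly how many times denser
```

By this back-of-the-envelope count, the game delivers feedback thousands of times more densely than a typical lecture hour.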
How do we create more environments for providing feedback more often during classes?

What is feedback, spiritually?

Are students vegging out, or do they have what they need to interact with the material being presented? Interaction = you initiate some kind of communication, and the environment acts back. When you say no texting, no passing notes, stop talking, you're cutting off all the interaction that allows you to get feedback.

Spray and Pray. iClickers - they can change the dynamic by being interactive.

Reading notes

Cognitive Computer Tutors: Solving the Two-Sigma Problem, Albert Corbett
http://www.springerlink.com/content/h6wvchfgg0u9pdjl/fulltext.pdf
  • Solving the 2-Sigma Problem with mechanized individual feedback and mastery learning in the context of a cognitive model of problem solving.
  • Cognitive tutors have closed the gap and maybe even surpass human tutors.
  • Cognitive tutors: employ a cognitive model of student knowledge for individualized feedback and mastery learning in the ACT Programming Tutor.

Rethinking Mastery Learning Reconsidered, T.R. Guskey, 1987.
http://www.jstor.org/stable/1170237?cookieSet=1
  • Pairing educational interventions may lead us to better results.

Tuesday, January 12, 2010

IP&T 682- The 2 Sigma Problem

Bloom, B. (1984). "The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring." Educational Researcher, 13(6), 4-16.

Mott, J. and Wiley, D. (2009). "Open for Learning: The CMS and the Open Learning Network." in education, 15(2).

Thoughts from the readings:
  • Changing one variable changes performance. We're not tapping into potential.
  • Teacher attitude - teachers not confident in student abilities. Excellent teachers take personal responsibility for the success of their students. Is excellence the goal?
  • Reinventing the past... why no progress?
  • Regression to what's easy/practical - creativity, more time-consuming, but most powerful. BECOME.
  • Engaging Learning - "Hard Fun." The human organism is a learning machine. Something happens b/w Kindergarten and 4th Grade, making learning "hard," "painful," "arduous," or "frustrating." There is no intrinsic reward for learning.
  • Econ 110 - "Satisficing" - Utility maximization: you will do as much as possible to get as much utility as possible. But you will also conserve your resources so you can produce more.
  • Prisoners of Time - You can hold time constant and let learning vary (K-12), or hold learning constant and let time vary (Montessori).
  • decliningbydegrees.org
  • Juniors and seniors doing better in freshman-level courses than freshmen: enculturating oneself into the BYU way of learning - learning how to survive "the system." Very uneven success strategies for that class.
  • The Five-Minute University
* Find at least two articles that reference the Bloom 2 Sigma piece.

Guskey, T. R. (1987). Rethinking Mastery Learning Reconsidered. Review of Educational Research, 57(2), 225-229. Retrieved January 13, 2010, from http://www.jstor.org/stable/1170237.

http://www.springerlink.com/content/h6wvchfgg0u9pdjl/fulltext.pdf



Thursday, January 7, 2010

And All of the Children are Above Average

Criterion-Referenced Test - Specific performance that was demonstrated.

Norm-Referenced Test - A test interpreted in terms of the relative position held in some known group.

Formative- Forms where you're going to go with the teaching. Assessment at the beginning of class could help you be a better teacher.

Elementary Secondary Education Act (1965)
  • Expanded the use of standardized testing
  • Accountability expectations not met

Tuesday, January 5, 2010

Oppression... I mean Assessment.

Welcome to IP&T 652: Assessing Learning Outcomes. I have a hard time with this class already. To me, assessment = standardization = unfair judgments = I'm not going to measure up in this class. (Wait - I suppose that's an unfair assessment in itself! Such recursion.) Why do we even teach this stuff? Why can't we learn about stuff that is good and that works instead?

Measurement
- objective observation
"The process of assigning numbers that estimate the degree to which a person or object possesses some specified property, attribute, or characteristic."
  • Estimating a degree - "Let's see how you measure up."
  • IQ Test- have to know the scale to interpret the data.
  • Assumption- if they get something right, they would also get the easier items right.
  • NOIR - Nominal (categorical, more of a label/description), Ordinal (ranking), Interval, Ratio
  • Measurement is NOT classification (no nominal data). So for hair color, measurement is the "Level 12," not brown, blond, or red.
  • Objects have characteristics, but it is the properties of the object that are measured. So we don't measure Sara; we measure how tall Sara is.
  • Sometimes intelligence is measured, some who measure
  • The measurement process assumes that the property being measured exists in varying degrees or amounts (issues of state vs. trait). For example, happiness is a STATE: our happiness fluctuates, varying by degree and dependent on circumstances. TRAITS are more PERMANENT.
  • Measure vs. Categorize
  • Measurement procedures are often indirect
  • Measurement is a human activity
  • Measurement ALWAYS contains some error.
  • http://www.akdn.org/india_environmental.asp - Women in India not attending schools b/c of lack of private bathrooms
  • Quote by Joseph Smith- all things are Spirit but vary by degrees. The Lord measures us: sin, trust in Him, knowledge... Maybe there is more to measurement that could even help me better understand the Atonement.
Reasons to Measure:
  1. Compare
    "Spencer is 6'2". Aaron is 5'2"." -- People can understand about how tall Spencer is, and they can know that Aaron is shorter than Spencer.
  2. Communicate
    "Spencer is 6'2"." -- People can understand about how tall Spencer is.

Assessment - Making a judgment regarding a person's outcome
"A systematic process of gathering measurement data and turning it into usable information for a specific purpose."
  • What should a 3rd-grade student learn? At what level should they perform? (This would require the evaluation component of assessment.)
  • Assessment is missing a component of considering the goodness of data. It can't really determine whether that performance was good or bad.
  • Assessment involves measurement.
  • Tests are a particular type of assessment.
  • A comprehensive assessment involves a set of planned measurements designed to provide evidence or information about a specific person or thing.
  • Assessment is as factual and as objective as it can be.

Evaluation - More thorough investigation of something
"The process of making a value judgment about the person or object based on some set of assessment data."
  • Involves judgment of merit and worth.
  • By nature somewhat subjective.
  • Usually based on a set of criteria (what we value).
  • For consistency, evaluation criteria should be clearly defined and generally agreed upon. (We try for that, at least.)
  • In graduate school, what should an A represent? Some degree of excellence? No truly shared view of what these really are.
  • The Lord evaluates us and takes into consideration opportunity and circumstance.
Next time: Policy Trends and Assessment