Wednesday, December 2, 2009

Dec 02

Data analysis is a skill, but it can be learned.

Why do we report?
  • accountability
  • organizing evidence for decision makers (use headings). Organize data by question/topic, not by instrument
There is no standard format for reporting, except when one is required.

If you are giving bad news, there is a right and a wrong way to do it. (Max Hall)

Sometimes you need to write tailored reports for different stakeholders.

Your results will sometimes be mixed with the discussion.

The introduction and the conclusion are the most important. Make sure they are well polished.

Tuesday, December 1, 2009

Nov 20

He had a good table with the following column headings:

Evaluation question
Which stakeholders?
What methods?
Rationale?
Timeline


If it is a true or false type of question then it is probably a qualitative question.

There also needs to be some sort of credibility check on the methods.

It was also really interesting to hear all of the ways that his EMATH evaluation went wrong. It just emphasized how you need to be flexible.

Monday, November 30, 2009

Nov 30

The evaluation template is a guideline; everyone will do it differently. The key is to make it clear and let the client know that you can do it. It is also important that the evaluator does the proposal the way that the client wants it done. Evaluators need to keep the client involved. We are reaching the end of the semester, so we should not delay the day of our repentance.

Tuesday, November 24, 2009

Nov 24

Qualitative

Coding Systems
  • A priori: you develop the codes before you begin
  • Inductive: you create the codes as you go
  • Then you have to look at inter-rater reliability

Enumerate or quantify the codes so that you can begin to see patterns
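
A minimal sketch of what that enumeration and a simple inter-rater check could look like, using hypothetical codes for ten interview excerpts and only the Python standard library (the class didn't prescribe a tool; percent agreement and Cohen's kappa are just common choices):

```python
from collections import Counter

# Hypothetical codes assigned by two raters to the same ten interview excerpts.
rater_a = ["support", "support", "barrier", "barrier", "support",
           "training", "barrier", "support", "training", "barrier"]
rater_b = ["support", "barrier", "barrier", "barrier", "support",
           "training", "support", "support", "training", "barrier"]

# Enumerate the codes so patterns start to show.
print(Counter(rater_a))

# Simple inter-rater agreement: proportion of excerpts coded identically.
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Percent agreement: {agreement:.0%}")

# Cohen's kappa corrects that agreement for chance.
labels = set(rater_a) | set(rater_b)
p_o = agreement
p_e = sum((rater_a.count(c) / len(rater_a)) * (rater_b.count(c) / len(rater_b))
          for c in labels)
kappa = (p_o - p_e) / (1 - p_e)
print(f"Cohen's kappa: {kappa:.2f}")
```

Counting the codes makes the patterns visible, and kappa gives a chance-corrected sense of how consistently two raters applied the same coding scheme.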

Negative case sampling is when you go to those who don't agree with the majority.

Monday, November 23, 2009

Nov 23

Organizing and Collecting the Information

The thing that is done the worst is the data analysis. It doesn't have to be long, but it does have to be intelligent.

State and Trait--a trait is more consistent. We want to find traits, not states. In order to do that we have to have consistent observation.

Descriptive statistics--are used to communicate the essential characteristics of a data set. (20% of students are female (N=20))

Inferential statistics--go beyond the data to infer to a population based on sample data.
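
A minimal sketch of the distinction using hypothetical exam scores and only the Python standard library (the numbers are invented for illustration): the mean and standard deviation describe this sample; the confidence interval infers something about the wider population.

```python
import statistics

# Hypothetical scores from a sample of 25 students.
scores = [72, 85, 90, 64, 78, 81, 95, 70, 88, 76,
          83, 69, 91, 74, 80, 86, 79, 73, 92, 67,
          84, 77, 89, 71, 82]

# Descriptive statistics: summarize the data set itself.
mean = statistics.mean(scores)
sd = statistics.stdev(scores)
print(f"Sample mean = {mean:.1f}, SD = {sd:.1f}, N = {len(scores)}")

# Inferential statistics: go beyond the sample to the population,
# here an approximate 95% confidence interval for the population mean.
se = sd / len(scores) ** 0.5
print(f"Approximate 95% CI for the population mean: "
      f"{mean - 1.96 * se:.1f} to {mean + 1.96 * se:.1f}")
```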

Look up the following terms:

Nominal
ordinal
ratio

Wednesday, November 18, 2009

Nov. 18

There is no correct way to do data collection.

You need to focus your data collection on your questions. Then we start the collection process.

Although you need to look at the literature you don't need to do a literature review.

In the Methods you will describe what data collection is needed, how the data will be organized and analyzed, how you are going to report the decisions, and the management plan (timelines and budget).

Just because you want to do something doesn't mean that you can do it or afford it.

I need to start answering these questions:
What data is needed to answer the questions?
What are the sources of this information?
What is the best way to get the required information? Who will get it?
Are there any restrictions or challenges involved?
Ask the right people the right question!

Monday, November 2, 2009

Nov. 2

Selecting evaluation questions/criteria

The questions should be clear; they will be the foundation of the evaluation.
The quality of the evaluation depends on how well you use methods to answer the questions.

Research questions are not evaluation questions
  • research questions are more specific
  • research questions are typically How, What, and Why questions
  • research questions involve a stated hypothesis
  • evaluation questions answer the "should" questions
  • evaluation questions are aimed at making decisions (What are the strengths and weaknesses of the program? What resources are needed?)
  • neither can truly show "cause and effect"
I should have a purpose statement and then some questions. (Look at the PowerPoint presentation for some examples of purpose statements and questions.)

Next time I should have a divergent phase done.

Monday, October 26, 2009

October 26

The pre-evaluation phase is important:
  • Define the evaluand
  • See if the evaluand even exists
  • Do a needs assessment
  • decide whether or not to proceed
  • define the stakeholders
  • look at the success indicators
This needs to be done to save you headaches.

Friday, October 23, 2009

October 23

What is my evaluation proposal going to be on? Blended learning? Video Communication in an online environment? Who are the stakeholders going to be?

Pre Evaluation Stage

Evaluations should not be cast into a single mold. For any evaluation, many good designs can be proposed, but no perfect ones. (Cronbach)

Here are the things that we need to do for the proposal:

Purpose
  • Questions and Criteria - success indicators for judging the merit and worth of the evaluand
  • Proposed Evaluation activities (data)
  • Detail how the evaluation will be conducted
  • Timeline and budget
What is the purpose of this Pre-evaluation phase?
  • evaluability assessment--can the object be assessed
  • determine a budget
  • describe the program and understand it
  • links program goals with program activities
  • determines whether the program is ready for a formal evaluation
  • helps focus the evaluation
Questions that I should ask:
  • What is to be evaluated? (Logic models: goals, process, activities)
  • Why is the evaluation needed? Intended purpose
  • Who needs it done? What do they want to know about it?
  • What conditions might constrain the evaluation?
  • pg 186 has questions you should ask to see if it can and should be done
Why should it not be done?
  • it is premature
  • the purpose is inappropriate (to delay or posture)
  • Resources are inadequate
What do I need to do?
  • Identify key stakeholders and audience (what are their concerns)
  • determine availability of resources
  • Identify limitations/restrictions
  • Discuss potential evaluation approaches
  • Discuss budget
  • Establish communication expectations
  • Evaluate political context
  • Timeline and budget

Monday, October 12, 2009

Participatory Evaluation

Relies on inductive reasoning
Uses several sources of data (this is on the constructivist side)
Is pragmatic and is not concerned so much with theory
You start where the participants are, so there are no set steps to follow.

The evaluator needs to teach the clients how to evaluate
They act as coaches in that they help people take charge.

This is a very bottom up approach.

Strength
  • Broad scope
  • willingness to handle complexity
  • Flexibility (no arrows)
  • Attention to contextual variables
  • Reflection on nuances surrounding the program
  • Fosters activism
Weakness
  • Willingness to look at complexity
  • Cost
  • Time
  • Bias
Responsive Evaluation

It is responsive to the realities in the program

They interact continually with the stakeholders

Create narratives or product displays

Share case studies

Wednesday, October 7, 2009

October 7th

Expertise-Oriented Evaluation

This is one of the oldest forms of evaluation. We can see this with guilds during the middle ages.

Expertise is used to judge an institution, program, product, or activity.
Many times a team of experts is used to evaluate the different parts of the design. The example used was that there is a team of umpires, but each is responsible for only specific things. An expert just means that you know how to evaluate, not necessarily perform, the task (Olympic judge).

It goes through systems:
  • Formal Review Systems (Accreditation). This is normally not based on learning outcomes but on what the expert sees
  • Informal Review Systems
  • Peer Reviews for Journals
  • Ad Hoc Panel Review--funding agency review panels such as the United Way Funding
  • Ad Hoc Individual Review--Consultants
Limitations--evaluations may just reflect personal bias
Strength--emphasize role of expertise and human wisdom in evaluation

Utilization-Focused Evaluation

A good evaluation is an evaluation that is used.

The role of the evaluator:
  1. Identify intended users
  2. Engender commitment/increase "readiness for evaluation" (this is the foundation)
  3. Help users generate own questions
  4. Carry out evaluation working closely with users throughout process

Monday, October 5, 2009

October 5

Consumer Based Evaluation
Listening to a Sales Rep.
  • It can be biased
  • Alternative motivation
  • Assumptions
  • What are the criteria?
Listening to the State Evaluation
  • Not really an evaluation
  • How was the evaluation done?
  • Was learning the criterion, or is it about money or back-room relationships?
Teacher Development Instructor
  • Lack of creativity
These evaluations are to help consumers make the correct decisions.
What information would be helpful to help consumers?
  • Cost
  • Durability
  • Is there a need and if there is does this meet the need?
  • other supporting costs
  • Accuracy of the tool
  • Does it support the end goal (perhaps standardized tests)
Many school districts do not look at all of these things but just focus on the standards.

There is a checklist of what consumers should look at on page 105.

Michael Scriven and Goal-Free Evaluation

When the evaluator shows up, the goals are not given to the evaluator. They observe the company and then attempt to deduce the goals. This lets the evaluator discover unintended goals. Another advantage is that you can later switch to a goal-based evaluation. It is also less susceptible to bias. They interact with the staff, but rather than focusing on certain things they do a holistic observation. They are also able to find the population that is actually impacted rather than the one it is supposed to impact.

Wednesday, September 30, 2009

September 30

Management-Oriented Evaluation Approaches (Ch. 5)
This is used to help management make decisions.
They identified the decisions that have to be made. However, you have to know your goals before using this approach.

Steps that need to be made
1. Identify the level or scale to be served.
2. For each level to be served, identify the decision situations to be served
3. Criteria (variables)
4. Collection--then you need to collect information about decision alternatives; consider source, instrument, sampling procedure, and schedule.
5. Organize, analyze, and report

This model really seems political.

CIPP Model (based on management-oriented evaluation)
A decision-oriented approach whose goal is to help management make the correct decision.

CIPP Model: helps find the kind of decision that has to be made
Context: planning objectives
Input: structuring decisions
Process: implementing decisions
Product: recycling decisions
If you were to do a total evaluation you would use everything but you can just do part of it.

Has a good connection with stakeholders but this can be a drawback as well.

Monday, September 28, 2009

September 28

Objective-oriented evaluation

Tylerian Evaluation Approach
Founded by Ralph W. Tyler
Evaluation as a process of determining if objectives were being met.

7 steps:
1. Establish goals
2. Classify goals
3. Define objectives in behavioral terms
4. Find situations in which achievement of objectives can be shown
5. Measurement
6. Collect data
7. Compare performance data with behaviorally stated objectives.

Because it is so goal oriented people can have a narrow view of value. People’s definition of goals can vary and cause some problems.

Metfessel & Michael’s Evaluation Paradigm

Stresses the involvement of stakeholders and would gather data in different ways.

Provus’s Discrepancy Evaluation Model

Viewed evaluation as a continuous information management process to assist the decision-making process.

Logic Model

You have a long-term goal and then set the objectives to reach that goal.

Goal Free Evaluation

They try to determine the goals that people have without being told what the goals are. This is good to find unintended goals.

The strength of this approach is that it is easy to do and pretty straightforward.


Experimental Approach

This can be impractical and even unethical. Trying to control factors is impossible; control is an illusion. It can also be unethical because you are denying treatments or methods to groups. It is also expensive.

September 25

Pseudo Evaluations:

Public Relations Inspired - It is basically propaganda. The object is to create an evaluation specifically to make something look good or bad. It has little rigor and is not a good evaluation.

Politically Controlled - when the results are edited or withheld because of a desired perception. This happens all of the time. Unlike public relations inspired evaluation it can be a good evaluation but then it is edited as a way of misleading the public. An example of this.

Pandering - catering to the clients desire for predetermined evaluation conclusion. Really close to public relations inspired. This could be when an evaluator gives a positive evaluation in hopes of being used again.

Empowerment as Evaluation - You get someone else to do it but you take credit for it. For example, when my cooperating teacher had me write my own letter of recommendation. Review publications will sell their top spot in a "top list."

Quasi Evaluations:

Accountability focus - limited scope/questions or simple objectives/criteria. Issues with the timeliness of results. Payment-by-results approaches. Preoccupation with specific outcomes. Using only test scores to evaluate the quality of a school would be an example of this.

Success Case - document successes, identify contextual factors that led to success.

Experimental Studies - Concerned with establishing cause and effect, often using only a narrow set of program factors. Naturalistic Evaluation is the opposite of this.

Management Information Systems - select limited set of variables as indicators of success. Again this is limiting the scope and time.

Cost Benefit Analysis - when you are just focused on the "bottom line"

Judicial Debate Approaches - role playing evaluators. Mock trials on the case of the evaluation.

Criticism and Connoisseurship - Experts make decision based on their experience and knowledge.

Pseudo Evaluations - consider accuracy, ethics/reporting, purpose and objective

Quasi Evaluations - consider their completeness and focus.

Wednesday, September 23, 2009

September 23



You need to look at where people are coming from, that is, their views of ontology and epistemology, to understand the way they evaluate. In ontology there is a spectrum of beliefs, with realists on one end who believe in a concrete truth, and on the other the nominalists, who think there is no truth and everything is relative.

This was a really helpful way to understand these topics. As we do our presentations we should be thinking what ontology it came from. We should also be looking at Utilitarian (Greater Good) vs. Intuitionist-Pluralist (Individual Impact).

Monday, September 21, 2009

Monday, Sept 21



Inductive and deductive logic seem pretty clear to me. However, there is another form of logic called emotional logic. When we were talking about emotional logic I thought of Stephen Colbert's word "truthiness." This is how he describes it:

Truthiness is tearing apart our country, and I don't mean the argument over who came up with the word…
It used to be, everyone was entitled to their own opinion, but not their own facts. But that's not the case anymore. Facts matter not at all. Perception is everything. It's certainty. People love the President because he's certain of his choices as a leader, even if the facts that back him up don't seem to exist. It's the fact that he's certain that is very appealing to a certain section of the country. I really feel a dichotomy in the American populace. What is important? What you want to be true, or what is true?…
Truthiness is 'What I say is right, and [nothing] anyone else says could possibly be true.' It's not only that I feel it to be true, but that I feel it to be true. There's not only an emotional quality, but there's a selfish quality.


There also exists a type of hierarchy of "Ways of Knowing"
1. Personal Experience (empiricism)
2. Reasoning
3. Expert Opinion or Tradition
4. Statistics
5. Pragmatism
6. Science
7. Scholarship
8. Conscience and Revelation

Why don't people use evaluation?
I was thinking that one big reason people avoid it is fear of failure.

What did you learn from the reading?
Evaluation comes from the government trying to cut out the pork.