Category Archives: Common Core Standards

Ohio’s almost C grade = Illinois’ F grade on Common Core English tests

Between 33 and 38 percent of third through eighth graders in Illinois “met expectations” or “exceeded expectations” on the language arts portion of the 2015 Common Core test. So in Illinois about one-third passed.

Across the border in Ohio, students took the exact same test, and about 40 percent of them scored at the same levels as in Illinois. However, Ohio set its acceptable score lower than Illinois did. Using Ohio’s terminology, 60 percent or more of students scored at “proficient” levels. So in Ohio about two-thirds passed.

[Chart: state results on PARCC language arts tests taken in 2015]


What gives? In Ohio if a student is approaching a C score on the test, he passes. In Illinois, if he has the same score, he fails.

A failing score means a student is not ready for the next grade, and if he continues at the same rate, he will not be prepared to enter college or a career.

The test maker, Partnership for Assessment of Readiness for College and Careers (PARCC), did not identify what score it considers to be a minimally passing score. It left that to the states. However, it did divide test scores into five broad categories:

• Level 1: Did not yet meet expectations (a minimum score of 650);
• Level 2: Partially met expectations (a minimum score of 700);
• Level 3: Approached expectations (a minimum score of 725);
• Level 4: Met expectations (a minimum score of 750); and
• Level 5: Exceeded expectations (a minimum score of 810).

A perfect score is 850.

Illinois defines a passing score as either a Level 4 or a Level 5; Ohio defines passing as a Level 3 (although Ohio uses different terminology, calling a Level 3 score “proficient”).
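To make the contrast concrete, here is a minimal sketch, in Python, of the level cutoffs listed above and the two states’ pass rules. The function names and structure are my own illustration, not anything published by PARCC or the states:

```python
# PARCC scale-score cutoffs for each performance level, as listed above.
# Scores range from 650 to a perfect 850.
LEVEL_CUTOFFS = [
    (810, 5),  # Exceeded expectations
    (750, 4),  # Met expectations
    (725, 3),  # Approached expectations
    (700, 2),  # Partially met expectations
    (650, 1),  # Did not yet meet expectations
]

def parcc_level(score):
    """Map a PARCC scale score to its performance level (1-5)."""
    for cutoff, level in LEVEL_CUTOFFS:
        if score >= cutoff:
            return level
    return 1

def passes(score, state):
    """Illinois required Level 4 or better; Ohio accepted Level 3 ('proficient')."""
    minimum_level = {"Illinois": 4, "Ohio": 3}[state]
    return parcc_level(score) >= minimum_level

# The same score of 730 (a Level 3) passes in Ohio but fails in Illinois:
print(passes(730, "Ohio"))      # True
print(passes(730, "Illinois"))  # False
```

The same numerical score lands in different pass/fail columns purely because each state chose its own minimum level.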

My dictionary defines “proficient” as having an advanced degree of competence; an expert. Yet students scoring “proficient” in Ohio do not know what is expected at their grade level. Semantics?

Before the test scores were announced, Ohio dropped out of future PARCC testing.

Other states that used the PARCC tests have announced results using the PARCC categories. Their results, as hyperlinked from a PARCC website, follow. Not all states released data complete enough to make one-to-one comparisons, and in some states large numbers of students did not take the tests. Given all the variables, direct comparisons are difficult to make. All scores shown below are for language arts only.

  • Arkansas’ results show a range of 54 to 66 percent of students scoring at a Level 3, 4 or 5; and a range of 29 to 35 percent of students scoring at a Level 4 or 5 in grades 3 to 8.
  • Colorado’s results show 37 to 41 percent of students in grades 3 to high school “met or exceeded” expectations. Definitions of “met or exceeded” were not provided.
  • Louisiana’s results show a range of 64 to 74 percent of students scoring at a level 3, 4 or 5; and 33 to 40 percent scoring at a Level 4 or 5.
  • Maryland’s results for 11th graders show 39.7 percent scoring at a Level 4 or 5.
  • Massachusetts’ results show 60 percent of students in grades 3 to 8 scoring at Level 4 or 5; and 83% scoring at a Level 3 or better.
  • Mississippi’s results show 49.4 percent of high school students scoring at a Level 4 or 5; and 72.7 percent scoring at a Level 3 or better.
  • New Jersey’s results for grade 3 through high school students show 40 to 52 percent scoring at a Level 4 or 5 while another 20 to 28 percent scored at a Level 3.
  • New Mexico does not list statewide results as a whole. But for grade 3, 24.9 percent scored at a Level 4 or 5; and another 23.6 percent scored at a Level 3.
  • Rhode Island’s results for grades 3 through high school show 31.4 to 38.3 percent of students scoring at Levels 4 and 5; and another 19.3 to 30.5 percent scoring at Level 3.
  • Washington, D.C.’s results show 25 percent of high school students scored at Level 4 or 5; and 37 percent scored at level 3, 4 or 5.

What does it all mean? If your child took a PARCC test, check the numerical score and ignore the terminology. If the score is 750 or better, relax.  If it’s lower, recognize that your child’s education is not up to snuff, according to the new Common Core State Standards. Then decide what you are going to do about it.


“No Child Left Behind” overturned by Senate; Obama expected to sign

The US Senate voted on December 9 to overturn “No Child Left Behind” and to replace the 2002 law with a law that puts more control of education in the hands of the states and local school districts. The House has already passed the same bill.



President Obama is expected to sign the bill into law before the year ends.

“No Child Left Behind” gave the federal government more power over education at the expense of the states.  Many people view that law as mandating more testing and as leading to the Common Core curriculum, both opposed by more and more parents.

Under the new law, states would set their own standards and decide how much testing is necessary.  States would still need to test students in math and English/language arts, to publish those results, and to help failing students and schools.

“No Child Left Behind” law overturned by House; Senate expected to follow

The 2002 law that increased the US government’s role in education and mandated testing has been scrapped by the House of Representatives. If its replacement passes the Senate, as expected, “No Child Left Behind” (NCLB) will be replaced by a law that greatly reduces the federal government’s part in education and reduces the number of tests US kids take. President Obama has indicated he would sign such an act.

This expected change in law does nothing to interfere with the Common Core State Standards which states can adopt or not as they see fit.  However, NCLB gave rise to the Common Core, and opposition to the Common Core and all its difficult testing has, in part, led to the end of “No Child Left Behind.”

Changes the new law would make include:

  • Schools would be less accountable to the federal government and more accountable to states and local school districts, which would write their own standards for schools, students and teachers.
  • The US Education Department’s role would be reduced.
  • Students in public schools would need to be tested annually in math and English/language arts, and those scores would need to be published.
  • Schools would need programs to help low achieving students and schools.
  • Title 1 money for poor schools would continue.


Soul searching in NY: Will common sense win?

In New York, the percentage of students in grades 3 to 8 who passed the statewide “Common Core” English test in 2015 was 31.3%, about the same as in other states. In math, it was 38.1%, a little better.

However, in New York 220,000 students “opted out” of taking the test in May. Those 220,000 students represent about 20% of all students who should have taken the English exams, the highest percentage in any state of students refusing to take the Common Core-aligned tests. In some school districts opt-outs ran as high as 60% of the total. With so many parents voting against these tests by stilling their children’s number 2 pencils, what is next in New York for the controversial tests?

  • The governor of New York, who has been a firm backer of the tests, is about to announce that teachers’ job evaluations will not depend on test results.
  • Scores needed to “pass” these tests might be lowered. Before the Common Core-aligned tests began in NY in 2013, students needed a grade of 65% to pass high school English courses. The grade has been raised to 79% for students taking the test in 2022 (originally in 2017) but education officials are rethinking that.
  • Teaching-for-the-test strategies might change. One New York City school has eliminated music and art for its algebra students so that there can be two teachers in algebra classes, one to teach the whole group and another to teach pockets of students needing more help. That school’s test results are higher than other schools’ results. Other schools are thinking of hiring specialist teachers for English and math in elementary grades rather than having one teacher teach all subjects.

[Chart: US literacy rank among other high-ranking countries]

It is unlikely that the tests will be scrapped completely despite nationwide opposition to them. Politicians and educators know US students are not up to snuff when it comes to language and math skills. Results on the latest international PISA reading tests (Program for International Student Assessment) show the US ranks 24th out of 65 countries, with Asian countries dominating the high scores.

Stay tuned.

Georgia Common Core test results for individual schools and counties announced

Georgia released school by school and county by county Common Core test results on November 16.

Results show what everyone expected: in most schools students scoring below grade level outnumbered students scoring at or above grade level. In third grade, for example, 60.1% of students performed below grade level in English; in fourth grade, 62.9%; and in fifth grade, 60.8%.

But a closer look at particular schools shows a bleaker picture. In many schools more than 60% of students scored below grade level, and several third grades around Georgia had more than 90% of students scoring below grade level. A few schools, though, had 70% score at or above grade level. The biggest discrepancy I noticed in a quick review of third grade raw numbers: one school had not quite 3% score at or above grade level, while another had a little over 80%.

Counties also showed wide discrepancies. Warren County, a rural county with a 27% poverty rate and few college graduates, showed a mean score of 442 for third graders. Meanwhile Forsyth County, the wealthiest county in Georgia with many college graduates and advanced degree holders, showed a mean score of 544, more than 100 points higher than Warren County. Atlanta’s mean score for third graders was 493.

What does it all mean?

  • Some schools are doing a much better job preparing their students for the kind of learning tested on the Milestone tests.
  • Where a child goes to school matters.

Milestone test results cannot be compared to the results of students in other states because Georgia created its own tests which were given only to Georgia students. This thwarts one of the purposes of the Common Core, which is to allow a comparison of educational achievement by students all over the country.

In the future, whether a child passes to the next grade will be influenced by these test scores. Georgia promises to release the results of tests taken in the spring of 2016 sooner.

If you are a parent or a teacher of a Georgia public school student, you can see how your child’s school and county did on these tests by grade level by going to the links below. The data are displayed as raw numbers and are inconvenient to read. For example, column headings are not repeated throughout the tables, so you must scroll up and down to find out what the numbers refer to. Everything is single-spaced, and there are so many columns that they do not all fit on most computer screens at the same time. School districts smaller than a county, such as city school districts, are listed at the end of the tables.

Spring 2015 Milestone tests grades 3 to 8 by school
Spring 2015 Milestone tests grades 3 to 8 by county
Spring 2015 Milestone tests grades 9 to 12 by school
Spring 2015 Milestone tests grades 9 to 12 by county

Comparing Common Core ELA test results in different states is like comparing apples, oranges, pears, tomatoes, kiwi, quince, cherries, raspberries. . .

One of the professed goals of the Common Core curriculum is that test results of students in one state, such as Georgia, can be compared to the test results of students in another state, say Connecticut.

[Table: ELA test results in Georgia by grade]

But the truth is, it can’t be done. Students all over the country took more than a dozen different tests this past spring. Fourteen states’ students took one shared test, so their results can be compared with one another; so can the results of students in another seven states that shared a second test. But then there are states like New York and Georgia, which wrote their own tests. Student scores from those states can’t be compared to any other state’s. And maybe that is the point, considering the horrible test results dribbling forth.

The Smarter Balanced Assessment Consortium created a test used by 14 states, mostly in the western US: Hawaii, Washington, Oregon, California, Idaho, Nevada, Montana, North Dakota, South Dakota, Missouri, West Virginia, Vermont, New Hampshire and Connecticut. Here are the results issued to date.

  • California has delayed the release of its test results until September 9, even though, by law, results should have been published in August. California has also removed from its Department of Education website past years’ test results in ELA and math, the two subjects tested with new tests this past spring, so that comparisons between the old and new test scores cannot easily be made. Insiders say the new test results are dismal. The state blames the delay on the launch of a new website to showcase the data, saying it wants to ensure that the test results are accurate.
  • New Hampshire will not release results until November, saying that since many students took the test with pencil and paper, not computer, it takes longer to score and record the results. New Hampshire has announced that its high school juniors will not take the Common Core test in the future, but will instead take an SAT exam.
  • Connecticut, which released its results at the end of August, shows that 55.4 percent of students in all grades passed the ELA test. About four percent of the state’s students boycotted the test.
  • In Missouri, 59.7 percent of students passed the ELA test; however, minority and low-income students scored about 13 percent lower than the rest of the students. In June, the state legislature banned future use of the test in Missouri.
  • In Oregon, 47 percent of third graders passed the ELA test.
  • West Virginia’s Department of Education reported proficiency rates below 50% in every grade except fifth, where 51% of students scored proficient. Third graders had a 46% proficiency rate.
  • 48 to 61 percent of Idaho students passed the Idaho ELA test. Complete results will be out in October.
  • 53 to 62 percent of Washington State’s students passed the ELA test.

The Partnership for Assessment of Readiness for College and Careers created a different test used by seven states: Colorado, New Mexico, Illinois, Louisiana, Maryland, New Jersey, and Massachusetts. Washington, D.C.’s students also took the test. No test results have been issued.

Twenty-three other states, including Georgia, made their own tests or decided to leave the Common Core.

  • In Georgia, 36% of test takers passed the ELA tests written by Georgia. On September 3, Georgia announced that of the third graders who took the ELA test, 36 percent passed: 26 percent did okay while 10 percent did better than okay. Almost two-thirds of students failed.
  • In New York, 31.3 percent of test takers passed the ELA tests written by that state. Ten percent of eligible students opted out of taking the tests, skewing the results. Gov. Andrew Cuomo, a defender of the Common Core Standards, said today he is forming a commission to revamp education in NY.
  • Arizona inaugurated its own test this past spring, but results will not be announced until October.

And five states have not taken part in the Common Core: Texas, Nebraska, Virginia, Indiana and Alaska.

Was Georgia’s test harder than New York’s or West Virginia’s? No one knows. It was a lot harder than the Criterion Referenced Competency Test (CRCT) that was given in Georgia in past years, according to the state superintendent of education, Richard Woods. In a statement, he said, “Our previous assessment, the CRCT, set some of the lowest expectations for student proficiency in the nation, and that cannot continue. Georgia Milestones sets higher standards for our students and evens the playing field with the rest of the nation.”

We will have to take Mr. Woods’ word that the new Milestones test “evens the playing field with the rest of the nation” since the Georgia test was given only in Georgia, and like apples and raspberries, cannot be compared to test results in other states.

A breakdown of Georgia results will be released by the Georgia Department of Education in October, as will the results of many other states.

Teach children to predict, an important reading skill

Predicting means anticipating ahead of time what might happen in a story. As adults, we do it all the time. We read a murder mystery and we predict “who done it.” We read a romance and predict how the couple will get together. We read a thriller and predict if the characters will escape.

Predicting is more than making a guess. It is using what we already know and applying it to a new situation. When children predict, they make a connection between what they know and what they don’t know yet. They increase the likelihood that they will comprehend what they read. Wild guesses are not predictions.

[Graphic: predicting from what we know to what we don't know]

Predicting focuses little children on what they are about to read. By looking at pictures, titles, subtitles, charts, photos, cartoons and other graphics, they grasp an idea about a story. Predicting attracts the child to a story. She wants to know if her prediction is correct. Predicting forces children to use visual or word clues to create meaning.

The Common Core State Standards include predicting in the reading standards.

However, predicting does not come naturally to all children. Children with dyslexia might be able to predict in a real life situation when there is no reading involved, but because they struggle deciphering the phonics code, they lose track of the meaning. Some children with dyslexia also have trouble sequencing. If so, predicting what will happen next is difficult.

Autistic children may also have trouble predicting since they have trouble interpreting social cues. The text might say that a character froze and was unable to talk, but the child might not know that the character is scared. How then can he predict what will happen next?

Here is a method of predicting that can be used with children of all ages. It combines vocabulary with predicting.

  • Go through a picture book or reading selection before the student reads it. Write down a dozen or more vocabulary words important to understanding the meaning of the text. Choose words which the child is likely to already know plus one or two new words.
  • Write or type the words clearly on a paper, and then cut the words apart. Make one set of words for each pair of children if children are working in pairs. Put the words in plastic sandwich bags.
  • Explain to the child that he will be predicting what a story is about. He will be acting like a detective by using word clues.
  • Let the child pull one word from the bag, read it aloud and tell you what it means. If the child can’t read yet, tell him what the word says. If he doesn’t know the meaning, explain it to him. Lay the word on the desk or table in front of the child.
  • Ask him what he thinks the story will be about based on that one word. Accept his answer.
  • Let the child pull a second word, repeating the previous two steps. Continue until all the words are read aloud. Encourage the child to change his mind about the prediction, or to become more convinced with each word.
  • Now ask the child to sort the words into categories or groups. (This step might be too advanced for some preschoolers.) Again, ask what he thinks the text might be about. Accept all answers, but gently steer the child into a prediction related to the text.
  • Now read the text. As you or the child read, note words the child pulled from the bag. Ask if the child still thinks his prediction is correct, or if he has changed his mind.
  • When the reading selection is complete, remind the child of his prediction and ask if he was correct.  Look at the words again.  Talk about what words helped and what words didn’t.  Ask what other words might have made the prediction closer to the truth.