Statistics education

[1][2] In the text arising from the 2008 joint conference of the International Commission on Mathematical Instruction and the International Association for Statistical Education, editors Carmen Batanero, Gail Burrill, and Chris Reading (Universidad de Granada, Spain; Michigan State University, USA; and University of New England, Australia, respectively) note worldwide trends in curricula that reflect data-oriented goals.

A first attempt to define and distinguish between these three terms appears on the ARTIST website,[4] which was created by Garfield, delMas, and Chance, and the distinction has since been included in several publications.

For example, Utts (2003) published seven areas of what every educated citizen should know, including understanding that "variability is normal" and how "coincidences… are not uncommon because there are so many possibilities".

Some instruments have been developed to measure college students' attitudes towards statistics and have been shown to have appropriate psychometric properties.

Rejecting the approach of reasoning under the null hypothesis and the restrictions of normal theory as contrived and, given modern computing power, no longer necessary, they use comparative box plots and the bootstrap to introduce concepts of sampling variability and inference.
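The flavor of such an activity can be conveyed with a short simulation. The sketch below, written in Python with made-up exam scores (the data set and function name are illustrative assumptions, not taken from any cited course), resamples the data with replacement to show how the sample mean varies without appealing to normal theory:

```python
import random

# Hypothetical exam scores (illustrative data only; not from the source).
scores = [62, 71, 75, 78, 80, 83, 85, 88, 90, 94]

def bootstrap_means(data, n_resamples=1000):
    """Resample the data with replacement and record the mean of each resample."""
    means = []
    for _ in range(n_resamples):
        resample = [random.choice(data) for _ in data]
        means.append(sum(resample) / len(resample))
    return means

means = sorted(bootstrap_means(scores))
# A simple percentile interval conveys sampling variability directly.
lower = means[int(0.025 * len(means))]
upper = means[int(0.975 * len(means)) - 1]
print(f"Approximate 95% bootstrap interval for the mean: ({lower:.1f}, {upper:.1f})")
```

Because the entire procedure is a loop of resampling and averaging, students can see where the interval comes from before meeting any distributional theory.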

[26] The UK's Office for National Statistics has a webpage[27] leading to material suitable for both teachers and students at school level.

In 2004 the Smith inquiry made the following statement: "There is much concern and debate about the positioning of Statistics and Data Handling within the current mathematics GCSE, where it occupies some 25 per cent of the timetable allocation.

On the one hand, there is widespread agreement that the Key Stage 4 curriculum is over-crowded and that the introduction of Statistics and Data Handling may have been at the expense of time needed for practising and acquiring fluency in core mathematical manipulations.

On the other hand, there is overwhelming recognition, shared by the Inquiry, of the vital importance of Statistics and Data Handling skills both for a number of other academic disciplines and in the workplace."

Estonia is piloting a new statistics curriculum developed by the Computer-Based Math foundation and built around its principle of using computers as the primary tool of education.

A difficulty of recruiting strong undergraduates has been noted: "Very few undergraduates positively choose to study statistics degrees; most choose some statistics options within a mathematics programme, often to avoid the advanced pure and applied mathematics courses."

[nb 1] As undergraduates, future statisticians should have completed courses in multivariate calculus, linear algebra, computer programming, and a year of calculus-based probability and statistics.

The ASA recommends that undergraduate students consider obtaining a bachelor's degree in applied mathematics as preparation for entering a master's program in statistics.

[nb 3] Professional competence requires a background in mathematics—including at least multivariate calculus, linear algebra, and a year of calculus-based probability and statistics.

The principle that college instructors should have qualifications in, and engagement with, their academic discipline has long been violated in United States colleges and universities, according to generations of statisticians.

Data on the teaching of statistics in the United States has been collected on behalf of the Conference Board of the Mathematical Sciences (CBMS).

The principle that statistics instructors should have statistical competence has been affirmed by the guidelines of the Mathematical Association of America, which have been endorsed by the ASA.

Second, statistical theory has often been taught as a mathematical theory rather than as the practical logic of science (the science that "puts chance to work", in Rao's phrase), and this has entailed an emphasis on formal and manipulative training, such as solving combinatorial problems involving red and green jelly beans.

Statisticians have complained that mathematicians are prone to over-emphasize mathematical manipulations and probability theory and under-emphasize questions of experimentation, survey methodology, exploratory data analysis, and statistical inference.

[48] In recent decades, there has been an increased emphasis on data analysis and scientific inquiry in statistics education.

In an influential talk at USCOTS, researcher George Cobb presented an innovative approach to teaching statistics that put simulation, randomization, and bootstrapping techniques at the core of the college-level introductory course, in place of traditional content such as probability theory and the t-test.
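The randomization tests central to this approach can be demonstrated in a few lines of code. The following Python sketch, using hypothetical scores for two class sections (the data and the function are illustrative assumptions, not Cobb's own material), estimates a p-value by repeatedly reshuffling group labels:

```python
import random

# Hypothetical scores for two class sections (illustrative data only).
group_a = [72, 75, 78, 81, 84, 86]
group_b = [65, 70, 73, 74, 77, 80]

def randomization_p_value(a, b, n_shuffles=10_000):
    """Estimate a one-sided p-value by repeatedly reshuffling group labels."""
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = a + b
    extreme = 0
    for _ in range(n_shuffles):
        random.shuffle(pooled)
        diff = sum(pooled[:len(a)]) / len(a) - sum(pooled[len(a):]) / len(b)
        if diff >= observed:
            extreme += 1
    return extreme / n_shuffles

print("Estimated p-value:", randomization_p_value(group_a, group_b))
```

Because the logic is a single loop over shuffles, the inference can be taught before, or in place of, the algebra behind the t-test.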

[65] Several teachers and curriculum developers have been exploring ways to introduce simulation, randomization, and bootstrapping as teaching tools at the secondary and postsecondary levels.

Courses such as the University of Minnesota's CATALST,[66] Nathan Tintle and collaborators' Introduction to Statistical Investigations,[67] and the Lock team's Unlocking the Power of Data[68] are curriculum projects based on Cobb's ideas.

Other researchers have been exploring the development of informal inferential reasoning as a way to use these methods to build a better understanding of statistical inference.

[69][70][71] Another recent direction addresses the big data sets that increasingly affect our daily lives and to which our daily activities increasingly contribute.

[73] Some researchers argue that as the use of modeling and simulation increases, and as data sets become larger and more complex, students will need better and more technical computing skills.