Predicting success, excellence and retention from early course performance: the surprisingly critical significance of formal academic writing competency

Students entering a first-year tertiary course have yet to be inculcated into the challenging requirements of contemporary Western tertiary educational approaches, the specific performance requirements of their learning institution and their qualification program, and the idiosyncrasies of the course and its teacher. However, success in first-year courses is crucial for building students’ capability. Perhaps the first-year experience is even more important for building students’ confidence for their future progress through their qualification. Consequently, I assert that it is prudent to provide performance feedback to students at the earliest possible stages of their tertiary academic career: specifically, feedback on their performance and the likely success of their engagement in tertiary studies should they pursue their studies in their ‘business as usual’ fashion.

Statistical regression and correlation analysis of student grade results were used to explore the relationship between early-semester and late-semester performance in a first-year course of tertiary study. The analysis revealed that it is possible to identify students who are likely to be strong or weak academic performers from the grade awarded to the draft work they submit for a Case Study assignment in week three of the 12-week course. The assessment of this submission was based solely on a generic six-trait method for evaluating writing quality. This information enables the teacher to give the student specific advice about their prospective future grade and why that grade is likely to ensue. Furthermore, the teacher can prescribe options the student can pursue to lift their performance above that apparently predestined by the statistical prediction of their result. In general, as a minimum, weak students need guidance to improve their formal writing skill.

Curiously, formal writing skill was required to be demonstrated in just 30 per cent of the overall course assessment. Nevertheless, students demonstrating strong writing skill in Assignment 1 achieved higher performance in several subsequent assignment components, including a complex group project, the construction of a Professional Learning Agenda, and the final multi-choice test. These tasks, whilst not requiring formal writing skills, do require interpersonal oral communication skills, good comprehension of written course study materials, and good self-study skills.

Future investigation
My first investigation used relatively straightforward statistical regression and correlation analysis of student grade results to explore the relationship between early-semester and late-semester performance. Future investigations are exploring the extent to which data mining/machine learning can extract more subtle nuances from the data. Comparisons will be made between the insights gained and the practicalities of using a data mining approach by an intelligent teacher without specialised training in data mining.



Replies to This Discussion

Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 1: Statistical analysis

Abstract
Students entering a first-year tertiary course have yet to be inculcated into the challenging requirements of contemporary Western tertiary educational approaches, the specific performance requirements of their learning institution and their qualification program, and the idiosyncrasies of the course and its teacher. However, success in first-year courses is crucial for building students’ capability. Perhaps the first-year experience is even more important for building students’ confidence for their future progress through their qualification. Consequently, I assert that it is prudent to provide performance feedback to students at the earliest possible stages of their tertiary academic career: specifically, feedback on their performance and the likely success of their engagement in tertiary studies should they pursue their studies in their ‘business as usual’ fashion.


This study analyses data on students’ performance collected through a 12-week semester. Unremarkably, the data, analysed by traditional statistical regression and correlation analysis, show that it is possible to forecast students’ overall course grade with good precision from formative assessments conducted within the first three weeks of the semester. Armed with this information, future students - under wise guidance from their teachers - should be able to overcome their apparently predestined grade by undertaking specific and focussed interventions throughout their course of study.


The purpose of this study, however, extends beyond the use of rudimentary statistical procedures for predicting students’ future success from early, quasi-formative assessment results. Specifically, my quest is to explore how the more recent field of machine learning/data mining might provide more accurate and/or earlier predictions of the ultimate performance of a specific student in a class. Furthermore, can the ‘knowledge’ extracted from this data mining exercise provide a student and teacher with greater clarity about the academic advice offered to a student?

Introduction
Last week, I continued to explore my new world of data mining and machine learning. As part of my self-learning, I’ve begun a mini-project to compare and contrast my traditional statistical approaches to exploring data with my first attempts at using machine learning. Specifically, my objective is to explore the extent to which I can predict a student’s final grades in my courses from grades they gain early in the class. The earlier I can identify the likely best and weakest performers, the earlier I can implement interventions to help students overcome the inevitable, should the prognosis be a poor grade. Furthermore, perhaps I could ‘buddy up’ the potentially high-achieving students with weaker performers in the hope that the weaker performers might learn some relevant tricks of the trade from the stronger performers through a process of cascade learning or osmosis. I’ve specifically chosen to learn about data mining/machine learning using this type of data set, as I have regularly conducted semester reviews of the data from courses that I teach.

The data set
My data set contains about 50 instances. Each instance relates to the grades from one student. There are 16 data items for each instance (student), pertaining to the assessment items completed by the student over the semester. As is usual in an academic course, there is a strict arithmetic relationship between the grades and the final mark. The final mark is a weighted sum of the contribution from each of the assignment sub-components.

The assessment regime for the course comprises the elements in Table 1.

Table 1: Assessment components for BSNS 5391 Innovation & Entrepreneurship


Assignment 1: Case Study Analysis, 15 per cent, comprising:
1a Case Study (in progress) 3%
1b Case Study (final) 11.4%
1c Writing Quality Assessment (final) 0.6%

Assignment 2: Group Project, 40 per cent, comprising:
2a Workshop Presentation 20%
2b Multi-Media Resource 10%
2c Test 10%

Assignment 3: Professional Learning Agenda (PLA), 25 per cent, comprising:
3a Strengths Quest Assessment 6.25%
3b PLA (in progress) 6.25%
3c PLA and reflective essay (final) 12.5%

Assignment 4: Test, 20 per cent, comprising:
4a Multi-choice questions 10%
4b Short essays 10%


The final grade is calculated as follows:

Final grade = 0.15 x Ass 1 + 0.40 x Ass 2 + 0.25 x Ass 3 + 0.20 x Ass 4
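
As a minimal illustrative sketch (assuming each assignment mark is already expressed on a 0-100 scale before weighting, which is my assumption rather than something stated in the syllabus excerpt), the weighted sum can be computed as follows:

```python
# Minimal sketch: the final course grade as a weighted sum of the four assignments.
# Assumes each assignment mark is already on a 0-100 scale (an assumption, not
# confirmed by the syllabus excerpt above).
WEIGHTS = {"ass1": 0.15, "ass2": 0.40, "ass3": 0.25, "ass4": 0.20}

def final_grade(marks: dict) -> float:
    """Return the weighted final grade for one student."""
    return sum(weight * marks[name] for name, weight in WEIGHTS.items())

# Example with hypothetical marks:
print(round(final_grade({"ass1": 71, "ass2": 81, "ass3": 75, "ass4": 68}), 1))  # 75.4
```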

Further details about the course BSNS 5391 are here:

The teaching strategy and philosophy:

The Course syllabus

Course structure and assessment strategy
Two pieces of assessment are completed within the first three weeks of the 15-week course: Assignments 1a and 3a. These two assessments are quasi-formative in the sense that they:

* Contribute a very small weighting of marks to the final grade
* Assess a component of what the student develops over future weeks into a more substantial submission.

For example, in one of the submissions, Assignment 1a, the student presents draft answers to Part One of a three-part Case Study assignment. The teacher provides written feedback on both the ideas submitted (content) and the quality of writing. However, the grade for Assignment 1a is determined by a rubric that relates solely to the quality of writing. The rubric utilises the ‘six-trait method for evaluating writing quality’. The six traits, weighted equally, are: ideas and content, word choice, grammatical conventions, organisation, voice (personality), and sentence fluency (Course Syllabus, Mellalieu 2010, pp. 23-24, modified from Norton, modified from Maryvale Elementary, Mobile, TX).

When students submit their complete Assignment 1 (classified as Assignment 1b), they are required (permitted) to rewrite Assignment 1a to accommodate the feedback provided by the teacher. Appropriate writing quality earns the student little direct credit for submitted course work: just 5 per cent of the weight of the final Assignment 1 submission (Assignment 1c). In effect, this is just 0.6% of the entire course weighting. However, there is ‘devil in the detail’. Appropriate professional writing standards are expected as an absolute requirement for the course. If a student submits an assignment with a writing quality of less than 24/30 on the six-trait rubric mentioned earlier, then two MAJOR consequences arise for the student. First, the student receives NO credit for the assignment until the assignment is resubmitted to an acceptable writing quality. Second, the original assessment grade stands: there is no extra credit gained for the repeat submission. This policy is derived from Haswell’s Minimal Marking policy (Haswell, 1983).

The second “staging point” in students’ progress through the course occurs mid-way through the semester. At this point, the students submit two further pieces of assessment: progress work on their Professional Learning Agenda (Assignment 3b), and the final submission of Assignment 1 (Assignment 1b, with 1c being the writing quality component of that assignment).

Over the remaining six weeks of the semester the students present the results of their Group Projects. Their study culminates in the Final Test. Some students choose not to sit the Final Test: typically they are international study-abroad students who need only gain a pass grade in the course. They prefer to allocate their study time to another course … or take early leave for a tour of the delights of Middle Earth/New Zealand. Consequently, the data set contains missing values for these students’ Final Test results.

Statistical analysis
My usual approach to analysing student grades at the conclusion of a course is to construct scatter plots between the various assignment elements. I add a trend line and calculate the correlation coefficient. These results are presented in the Figures.
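
For readers who want to reproduce this kind of plot outside a spreadsheet, here is a minimal sketch (the data values and column meanings are hypothetical, not the actual class results) that fits a least-squares trend line and reports R² for one pair of assessment components:

```python
# Sketch: scatter plot, least-squares trend line, and R-squared for two
# assessment components. The marks below are hypothetical, not the class data.
import numpy as np
import matplotlib.pyplot as plt

def trend_and_r2(x, y):
    slope, intercept = np.polyfit(x, y, 1)        # least-squares straight line
    residuals = y - (slope * x + intercept)
    r2 = 1 - np.sum(residuals ** 2) / np.sum((y - y.mean()) ** 2)
    return slope, intercept, r2

x = np.array([14, 18, 20, 22, 24, 25, 27, 28], dtype=float)  # e.g. Assignment 1a quality (/30)
y = np.array([68, 72, 75, 80, 83, 82, 88, 90], dtype=float)  # e.g. final course grade (%)

slope, intercept, r2 = trend_and_r2(x, y)
plt.scatter(x, y)
plt.plot(x, slope * x + intercept)
plt.xlabel("Assignment 1a writing quality (/30)")
plt.ylabel("Final course grade (%)")
plt.title(f"R² = {r2:.2f}")
plt.show()
```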

By inspection, several observations are salient. Figures may be viewed here:

* A high Final course grade is STRONGLY associated with a high grade in the first assignment. R² = 0.47. Note: the Final Grade includes a weighting of 15 per cent based on the Assignment 1 grade, so some correlation is expected. (Figure 1)
* A high Assignment 1 grade (1b) is STRONGLY associated with high writing quality in the final submission, 1c. R² = 0.54. Note: this correlation excludes the writing quality component; i.e. the plot is Assignment 1b versus 1c. (Figure 2)
* High writing quality in the final version of Assignment 1 (1c) is STRONGLY associated with the writing quality of the trial draft (1a). R² = 0.45. (Figure 3)
* A high Final Test grade is moderately associated with a high Assignment 3 grade. R² = 0.26. (Figure 4)
* A high Final course grade is moderately associated with the writing quality of the trial draft assignment. R² = 0.24. (Figure 5)
* A high Final Test grade is slightly associated with the Assignment 1 (total) mark. R² = 0.15. (Figure 6)
* A high Multi-choice grade in the Final Test is slightly associated with a high Assignment 1 grade. R² = 0.15. (Figure 7)
* A high grade for Assignment 3 overall is slightly associated with a high grade for Assignment 1 overall. R² = 0.16. (Figure 8)
* More subtly, a high grade for the final part of Assignment 3 (i.e. 3c) is slightly associated with a high grade for the final submission of Assignment 1 (i.e. 1b). R² = 0.13. (Figure 9)
* A high grade for Assignment 1 is slightly associated with a high grade for Assignment 2. R² = 0.15. (Figure 10)

No associations were found in these situations:

* Assignment 1 writing quality (1c) and the Final Test essay grade. R² = 0.04. (THIS IS A SURPRISE!) (Figure 11)
* Group Project (Assignment 2) and Assignment 3. R² = 0.01. (Figure 12)
* Group Project (Assignment 2) and the Final Test. R² = 0.00. (Figure 13)
* First and last submission components for Assignment 3 (i.e. 3a and 3c respectively). R² = 0.02. (Figure 14)

Assignment 2 is a group project, so one would not expect a strong association between it and the individually completed assessments, particularly since I allocated the students to their teams.

Discussion
A high grade in the course BSNS 5391 is associated STRONGLY with students who gain high grades in the Case Study, Assignment 1 (R² = 0.47). This is despite the fact that the weighting of the Case Study assignment is just 15% of the overall assessment weighting for the course. Students’ grades from the Case Study assignment, therefore, are a strong predictor of general academic performance in three other quite different academic assessments: the Group Project (Assignment 2), the Professional Learning Agenda (Assignment 3), and the Final Test (Assignment 4).

Given this key finding, how early in the course can we predict whether a student’s performance through the remainder of the course is likely to be strong or weak? The statistical analysis shows that once the teacher has assessed students’ draft Case Study submissions (Assignment 1a), submitted by week 3, the students are predestined towards a high or low grade in the course overall through the following chain of events:

High formal writing quality evidenced in a student’s draft is STRONGLY likely to be evidenced in the writing quality of their final Assignment 1 submission. Furthermore, that high writing quality contributes strongly - but indirectly - to a high grade for the assignment overall. I say ‘indirectly’ because the weighting attributed to writing quality is just 3.6% of the course overall, and just 0.6% for the final submission. From a grade-contribution point of view, students’ motivation to write adequately is not strong - apart from the teacher’s implementation of the fiendish Haswell Minimal Marking strategy, which requires students to resubmit if their writing does not meet the professional standard of 24/30 on the six-trait writing evaluation rubric mentioned earlier.

Secondly, a high grade for Assignment 1 is slightly associated (R² = 0.15) with a high grade for the subsequently-submitted Group Project, Assignment 2. This is curious since the teacher attempted to create seven-person groups through a stratified, uniformly random scheme. However, some subterranean reconfiguration of group membership became apparent in several instances. An all-Chinese team emerged, which was certainly not how I had configured the initial groups. In addition, the large teams of seven were permitted to bifurcate into two smaller teams by mutual negotiation.

Thirdly, a high grade for Assignment 1 is slightly associated (R² = 0.16) with a high grade for the subsequently-submitted Assignment 3.

Finally, a high grade for Assignment 1 is slightly associated (R² = 0.15) with high grades for the Final Test, specifically the multi-choice component. The latter point is somewhat surprising, as the BSNS 5391 multi-choice test depended primarily on reading and course-material recall, rather than the careful reading, demanding analysis, and professional report-style writing required in the Case Study assessment.

A curiosity remains that there was NO correlation between the writing quality of students’ first assignment (1c) and the grade for the short (five-paragraph) essays written in the Final Test.


Predicting a student’s specific final grade

Overall, the regression analysis (Figure 5) means that we can use the following equation to predict a student’s overall course grade from the writing quality of their first draft assessment, Assignment 1a:

final = 0.86 q + 61

where:
q = writing quality on a scale 1… 30 assessed by the six-trait writing evaluation rubric
final = the course grade on a scale 0 … 100.

Example: A student gains a writing quality, q, of 20/30. Their likely overall course grade will be: 0.86 x 20 + 61 = 17.2 + 61 ≈ 78. However, given the moderate correlation coefficient, R² = 0.24, there is still plenty of opportunity for the student to do better - or worse - than this result!

Once the student has submitted all of Assignment 1, their overall course grade can be predicted with greater precision from the equation:

final = 0.29 assignment1 + 57

where:
assignment1 = the grade for Assignment 1 (all components), on a scale 0…100.

Example: A student gains a grade of 71 in Assignment 1. Their likely overall course grade will be: 0.29 x 71 + 57 = 20.6 + 57 ≈ 78. However, given the higher correlation coefficient, R² = 0.47, the student will now need to work very diligently in their remaining assignments to gain a superior mark. They are now almost on a predestined path … unless they are lazy and fail the remaining assignments.
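
As a small sketch, the two predictors above can be collected into helper functions (the coefficients are those reported in this post; no error bands are computed here):

```python
# Sketch of the two regression predictors reported above.
# final_from_quality: from Assignment 1a writing quality (q, on a 0-30 scale).
# final_from_assignment1: from the complete Assignment 1 mark (0-100 scale).
def final_from_quality(q: float) -> float:
    return 0.86 * q + 61            # R² ≈ 0.24 (Figure 5)

def final_from_assignment1(a1: float) -> float:
    return 0.29 * a1 + 57           # R² ≈ 0.47 (Figure 1)

print(round(final_from_quality(20)))      # ≈ 78
print(round(final_from_assignment1(71)))  # ≈ 78
```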

Summary and conclusion
Statistical regression and correlation analysis of student grade results were used to explore the relationship between early-semester and late-semester performance in a first-year course of tertiary study. The analysis revealed that it is possible to identify students who are likely to be strong or weak academic performers from the grade awarded to the draft work they submit for a Case Study assignment in week three of the 12-week course. The assessment of this submission was based solely on a generic six-trait method for evaluating writing quality. This information enables the teacher to give the student specific advice about their prospective future grade and why that grade is likely to ensue. Furthermore, the teacher can prescribe options the student can pursue to lift their performance above that apparently predestined by the statistical prediction of their result. In general, as a minimum, weak students need guidance to improve their formal writing skill.

Curiously, formal writing skill was required to be demonstrated in just 30 per cent of the overall course assessment. Nevertheless, students demonstrating strong writing skill in Assignment 1 achieved higher performance in several subsequent assignment components, including a complex group project, the construction of a Professional Learning Agenda, and the final multi-choice test. These tasks, whilst not requiring formal writing skills, do require interpersonal oral communication skills, good comprehension of written course study materials, and good self-study skills.

Future investigation
This investigation used relatively straightforward statistical regression and correlation analysis of student grade results to explore the relationship between early-semester and late-semester performance. Future investigations will explore the extent to which data mining/machine learning can extract more subtle nuances from the data. Comparisons will be made between the insights gained and the practicalities of using a data mining approach by an intelligent teacher without specialised training in data mining.

I could attempt more sophisticated statistical analysis using multiple regression and discriminant analysis. But now it’s time to explore machine learning. Probably a Naive Bayes classifier will be an early tool in my exploration.
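
As a first machine-learning sketch, a Naive Bayes classifier could label students as likely ‘strong’ or ‘weak’ performers from their early marks. The example below uses scikit-learn and invented training data purely to show the shape of the approach; my own subsequent experiments (Parts 2a and 2b) use the WEKA workbench instead:

```python
# Sketch: Gaussian Naive Bayes classification of a likely performance band
# from early assessment marks. All feature values and labels are invented.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical training data: [Assignment 1a writing quality, Assignment 3a mark]
X = np.array([[14, 55], [18, 60], [20, 70], [24, 78], [26, 82], [28, 90]])
y = np.array(["weak", "weak", "strong", "strong", "strong", "strong"])

clf = GaussianNB().fit(X, y)
print(clf.predict([[19, 65]]))         # predicted performance band
print(clf.predict_proba([[19, 65]]))   # class probabilities
```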

Resource use
At this stage of the project, the resources used are as follows:

* Initial analysis of data: 4 hours
* Writing of draft report and additional analysis: 7 hours
* Preparation and publication on websites: 1 hour
* Total 10 hours

Statistical analysis conducted using NeoOffice 3.0 Patch 5.

References

Haswell, R. H. (1983). Minimal marking. College English, 45(6), 600-604. Retrieved from http://www.google.co.nz/url?sa=t&source=web&cd=1&ved=0C...

Mellalieu, P. J. (2009). Writing to learn argument and persuasion: A ‘Trojan Horse’ for promoting the adoption of ‘Writing Across the Curriculum’ (WAC) principles. Presented at the Unitec Teaching and Learning Symposium, Auckland, NZ: Unitec Institute of Technology. Retrieved from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2008/7/31_Wr...(WAC)_principles.html

Mellalieu, P. J. (2010, August 21). Course Handbook and Syllabus Unitec BSNS 5391 Innovation and Entrepreneurship. Scribd. Retrieved August 21, 2010, from http://www.scribd.com/doc/36191676/Course-Handbook-and-Syllabus-Uni...


First posted here
Mellalieu, P. J. (2010a, November 29). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education: Part 1: Statistical analysis - Figures. Innovation & chaos ... in search of optimality. Retrieved November 29, 2010, from http://pogus.tumblr.com/post/1723717009/predicting-success-excellen...

Mellalieu, P. J. (2010b, November 29). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 1: Statistical analysis. Innovation & chaos ... in search of optimality. Retrieved November 29, 2010, from http://pogus.tumblr.com/post/1724117822/predicting-success-excellen...
A Decision Support System for predicting success, excellence, and retention from students' early course performance: a machine learning approach in a tertiary education programme in innovation and entrepreneurship: Part 1: Project summary

A machine learning/data mining exercise using the WEKA Explorer workbench identified the feasibility of predicting accurately students' final course performance from formative and summative assessments conducted within the first three weeks of a 12-week semester (Mellalieu 2010 a,b,c,d,e; Witten & Frank 2005; Hall, Frank et al, 2009). This finding led to the decision of the course instructor to initiate construction of a prototype Decision Support System (DSS) to provide a student and their academic advisors the means to predict the student's personal academic success and final grade as they progressed through the course (Mellalieu, 1982).


The DSS is to be implemented as a spreadsheet. Inputs include the student's grade achieved on assignment components throughout the period of the course, demographic data, the student's five talents as identified by the Gallup StrengthsFinder 2.0 instrument, and other psychometric data. The prediction system underpinning the DSS is based on several rules and regression equations derived from a test data set of student results from a previous delivery of the course in 2010. A schematic overview of the inter-relationship between several pertinent factors is presented and explained in Mellalieu (2010 f).
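
Purely to illustrate the structure of such a prediction system (the rule threshold below is hypothetical; only the regression coefficients come from the statistical analysis earlier in this thread), a rule-plus-regression predictor might be sketched as:

```python
# Hypothetical illustration: a simple rule classifies the student, and a
# regression equation predicts the final grade. The 24/30 threshold mirrors
# the course's required writing standard; the actual DSS rules may differ.
def classify(writing_quality_1a: float) -> str:
    return "at risk" if writing_quality_1a < 24 else "on track"

def predict_final(writing_quality_1a: float) -> float:
    return 0.86 * writing_quality_1a + 61   # regression from Part 1

student_q = 20
print(classify(student_q), round(predict_final(student_q)))  # at risk 78
```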


Further figures are available here.

The course is a first year tertiary education course in innovation and entrepreneurship (BSNS 5391) that is compulsory for all students in the management and marketing majors of a Bachelor of Business programme. The DSS will output a student's estimated grade and class percentile ranking at the start of the course (Week 1), updated following assessments submitted by the student in Week 3, Week 6 (mid-way), and Week 12. A 'last chance' final assessment occurs during the examination period, Week 14, the result for which can be predicted from all the student's previous assessment results... and possibly other demographic data. The DSS output will also include a measure of the precision of the estimate, such as the statistical standard error of the grade estimate. As the student progresses through the course, we expect the precision to improve, since there will be an increasing number of data items upon which to base the prediction. Furthermore, the student will know their actual 'earned grade' contribution to the final course grade.
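
One way to produce such a precision measure, assuming a simple linear regression underlies each prediction (my assumption for this sketch), is the standard error of the fitted value at the student's mark:

```python
# Sketch: standard error of a predicted grade from a simple linear regression,
# i.e. the kind of precision measure the DSS could report with each estimate.
# The marks below are hypothetical.
import numpy as np

def prediction_se(x, y, x_new):
    """Standard error of the fitted (mean) response at x_new."""
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    s = np.sqrt(np.sum(residuals ** 2) / (n - 2))          # residual standard error
    sxx = np.sum((x - x.mean()) ** 2)
    return s * np.sqrt(1.0 / n + (x_new - x.mean()) ** 2 / sxx)

x = np.array([14, 18, 20, 22, 24, 25, 27, 28], dtype=float)
y = np.array([68, 72, 75, 80, 83, 82, 88, 90], dtype=float)
print(round(prediction_se(x, y, 20.0), 2))  # a smaller value means a tighter prediction
```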


An enhanced version of the DSS could be developed to provide feedback about what steps the student can undertake to 'beat the system' in order to obtain higher grades than their projected grade. For example, the most potent indicator of overall course performance revealed by the data mining exercises was the crucial importance of a student's ability to write formal academic English in response to a written case study assignment. This finding suggests that if a student undertakes personal coaching in the antecedents required to rite gude inglish [sic] then their chances of achieving an overall higher grade will increase. I suspect these antecedent competencies include: reading for comprehension, paraphrasing information, writing persuasive and/or logical arguments, selecting appropriate words to express thoughts, writing Global English sentence structures, paragraphing using topic sentences, and presenting/organising arguments and evidence in an appropriate genre such as a formal business report, short essay, or memo (Mellalieu 2007, 2008, 2010 g,h).

The foregoing are a selection of core, generic academic competencies that might well underpin success in many other business and tertiary education courses. Consequently, a DSS that provides students with 'early warning' of the likelihood of their academic success or failure based on an early assessment of these competencies may encourage students, their instructors, and advisors to take early, proactive action to remedy deficiencies both within and outside the classroom.

References
Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., & Witten, I. H. (2009). The WEKA Data Mining Software: An Update. SIGKDD Explorations, 11(1).
  
Witten, I. H., & Frank, E. (2005). Data mining: practical machine learning tools and techniques (2nd ed.). Morgan Kaufmann.
 
Author's supporting studies
Mellalieu, P. J. (1982). A Decision Support System for Corporate Planning in the New Zealand Dairy Industry (Doctor of Philosophy in mathematics, statistics and operations research). Victoria University of Wellington. Retrieved from http://hdl.handle.net/10063/568

Mellalieu, P. J. (2007, June 3). Let’s all learn and teach Global English in our business schools! Peter Mellalieu - Teacher. Retrieved April 21, 2010, from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2007/7/3_Let...!.html

Mellalieu, P. J. (2008). Writing to learn argument and persuasion: A 'Trojan Horse' for promoting the adoption of 'Writing Across the Curriculum' (WAC) principles (Working paper). Auckland, NZ: Unitec New Zealand Centre for Innovation & Entrepreneurship. Retrieved from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2008/7/31_Wr...(WAC)_principles.html

Mellalieu, P. J. (2010a, November 29). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 1: Statistical analysis. Innovation & chaos ... in search of optimality. Retrieved November 29, 2010, from http://pogus.tumblr.com/post/1724117822/predicting-success-excellen...

Mellalieu, P. J. (2010b, November 29). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education: Part 1: Statistical analysis - Figures. Innovation & chaos ... in search of optimality. Retrieved November 29, 2010, from http://pogus.tumblr.com/post/1723717009/predicting-success-excellen...

Mellalieu, P. J. (2010c, November 30). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2a: Preparing for data mining. Innovation & chaos ... in search of optimality. Retrieved December 3, 2010, from http://pogus.tumblr.com/post/1983876991/predicting-success-excellen...

Mellalieu, P. J. (2010d, November 30). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2a: Preparing for data mining - Figures. Innovation & chaos ... in search of optimality. Retrieved November 30, 2010, from http://pogus.tumblr.com/post/1983667421/predicting-success-excellen...

Mellalieu, P. J. (2010e, December 3). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2b: Data mining with WEKA Explorer. Innovation & chaos ... in search of optimality. Retrieved December 3, 2010, from http://pogus.tumblr.com/post/2079452955/predicting-success-excellen...

Mellalieu, P. J. (2010f, December 5). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 3a: A comprehensive time-staged model - Figures. Innovation & chaos ... in search of optimality. Retrieved December 5, 2010, from http://pogus.tumblr.com/post/2100181658/predicting-success-excellen...

Mellalieu, P. J. (2010g, November 8). Preparing an essay for a written test or examination (Part A). Innovation & chaos ... in search of optimality. Retrieved November 8, 2010, from http://pogus.tumblr.com/post/1512039456/short-essay-advice

Mellalieu, P. J. (2010h, November 8). Preparing an essay for a written test or examination (Part B): The Pogorific A+ exam-sitting method. Innovation & chaos ... in search of optimality. Retrieved November 8, 2010, from http://pogus.tumblr.com/post/1512453524/scheduling-time-in-a-test

This article first posted on http://pogus.tumblr.com/
Assessment policy and generic six-trait formal writing evaluation rubric used in BSNS 5391 Innovation and Entrepreneurship
Attachments:
Course Handbook for BSNS 5391 Innovation and Entrepreneurship: Executive summary

Overview
This course handbook is the ‘blueprint’ detailing many elements of the course BSNS 5391 Innovation and Entrepreneurship. The handbook introduces the course descriptor, assignments, and learning resources for the course. You are provided with specific guidance for preparing for the first in-class studio contact period and the first assignment.

Purpose
The purpose of the handbook is to ensure that a student enrolling in the course is informed comprehensively about:
- The aims of the course
- The learning outcomes that students are required to demonstrate
- Reading, personal study, workload, and assignment schedules
- The learning resources required for perusal by students, including texts, lecture notes, on-line resources, and web-based learning support systems
- The assignments that help students focus and demonstrate their learning
- Agendas for the first few weeks of class, including materials relevant to those classes
- Specific activities that you should schedule immediately for starting course assignments and ‘learning adventure’.

Unique features
The course seeks to demonstrate what it preaches: innovation, entrepreneurship, creativity, continuous improvement, and shared learning informed by the principles of Unitec’s ‘Living Curricula’ policy. Consequently, the course has several features that are less common elsewhere:
- Requirement to progress work on three assignments concurrently
- Requirement to write a regular personal journal (blog) culminating in a reflective essay and a strengths-based Professional Learning Agenda
- An engaging new-venture team-based assignment in which you create and lead a teaching and learning workshop for other class participants
- A weekly opportunity at the start and end of each class to raise concerns and refine the tutor’s approach to leading your learning journey. Kaizen.

Recommendations for immediate action
- Read the Introduction
- Skim-read the Table of Contents and Course Descriptor
- Complete the preparation for the first class meeting as specified in: Agenda and Course Material for Studio 1.
- Note the navigation hyperlinks in the top left header (TOC) and bottom left footer (BSNS) that take the on-line reader directly to the Table of Contents, and Semester-Specific course calendar.

This is a memo I recently submitted to my students based on my analysis of their performance in the first three weeks of their course with me....

The first assignment in BSNS 5391 Innovation & Entrepreneurship introduces the students to traditional business case analysis. I achieve this introduction through a series of comprehension-like questions that lead the students to write a Formal Business Report.

Students submit their answers to one quarter of the questions for the assignment in Week 3 of the course (Assignment 1a). I provide formative feedback based SOLELY on their writing quality, assessed against six criteria adapted from a rubric from Maryvale School. My machine learning analysis identified that students' performance on this draft assignment had extraordinary predictive power: not only for the completed Assignment 1, but also for such tasks as an open-book multi-choice test!

The memo to my students

Based on the mark for your Assignment 1a, I am able to predict what your grade will be for future assignments in this class. Table 3.3 shows the predictions.

If you do not like the predictions then take heart:

  • If you work smarter and harder than the 'average' student you may be able to gain a better mark than the prediction given your mark thus far.
  • The forecast error is plus or minus 5 marks. For example, if you gained 15 marks for Assignment 1a, then I predict you will gain 62 +/- 5 (57 to 67) marks for Assignment 1, and 73 +/- 4 (68 to 78) marks for the Final Mark for the course overall.
  • If you take advantage of the many resources within and beyond Unitec to improve your formal reading, comprehension, and writing skills, the benefits will extend through and beyond this course (Mellalieu, 2008).

These forecasts are NOT guarantees. The forecasts are based on statistical analysis of previous students' grades for this class. This class appears to be quite different from the class from which I derived my forecast equations, so the error (plus or minus) may be higher. However, by the time I grade your Assignments 1b/1c and 3b, I will be able to present predictions to +/- 3 marks.

Table 3.3: Final Mark predicted from Formal Writing Quality (Assignment 1a) – (2011-1 Actual)

The following Table 3.4 shows Table 3.3 converted to Grades according to Unitec's grade equivalence system.

Is it possible to get an A+ as a Final Mark in this course? YES. You must achieve an A or A+ for your Assignment 2. The grade shown in Table 3.3 for Assignment 2 is the class average I anticipate (81, A-). Since Assignment 2 is a group assignment, the grade for Assignment 2 is not directly dependent on the grade you get for Assignment 1. But see my comments below.

Table 3.4 Final Grade predicted from Formal Writing Quality (Assignment 1a) – (2011-1 Actual)



For those with the #StrengthsQuest talents of Competition and Achiever, you may like to know how your performance compares with others in the class. Table 3.5 shows how many students are expected to grade below you, given your mark for Assignment 1a. I calculate this data based on the mean and standard deviation of the class's results, combined with my guesstimates of future class performance.

Example: Given a student achieves a writing quality of 20 (Assignment 1a), I predict that the student will achieve a grade better than 66.4 per cent of all other students in the class for Assignment 1. The table also suggests that only 23.7 per cent of students will achieve the required writing level of 24/30 for Assignment 1 (23.7 = 100 - 76.3).
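
The relative-placing figures in Table 3.5 can be reproduced from a class mean and standard deviation using the normal-distribution assumption stated above. In the sketch below the mean and standard deviation are illustrative values only, not the actual class statistics:

```python
# Sketch: per cent of the class expected to fall below a given mark, assuming
# marks are normally distributed. The mean and SD below are illustrative only.
from scipy.stats import norm

def percent_below(mark, class_mean, class_sd):
    return 100 * norm.cdf((mark - class_mean) / class_sd)

print(round(percent_below(20, 18.0, 4.7), 1))        # per cent expected below a mark of 20/30
print(round(100 - percent_below(24, 18.0, 4.7), 1))  # per cent expected at or above the 24/30 standard
```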


Table 3.5: Student's relative performance predicted from Formal Writing Quality (Assignment 1a) – (2011-1 Actual)


 In a later posting, I'll show you how the following two factors predict the likely grade you will gain for Assignment 2:

  • Average writing quality of team members (as measured by Assignment 1a)
  • Average team level of entrepreneurship (as measured by the talents that Bolton & Thompson argue are required for a serial entrepreneur, namely those identified in their FACETS model)

The BSNS 5391 boffins welcome with eagerness your questions and comments about the data presented here!

Further information from our laboratories

Mellalieu, P. J. (2011, April 12). A Decision Support System for predicting Retention, eXcellence, & Success: The Dashboard for ReXS. Innovation & chaos ... in search of optimality. Retrieved April 12, 2011, from http://pogus.tumblr.com/post/4548163448/image-via-wikipedia-a-decis...

Mellalieu, P. J. (2011). Unitec BSNS 5391 Innovation and Entrepreneurship. Course Handbook and Syllabus. Auckland, NZ: Unitec New Zealand. Retrieved from http://www.scribd.com/doc/36191676/Course-Handbook-and-Syllabus-Uni...

Mellalieu, P. J. (2010a, November 29). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 1: Statistical analysis. Innovation & chaos ... in search of optimality. Retrieved November 29, 2010, from http://pogus.tumblr.com/post/1724117822/predicting-success-excellen...

Mellalieu, P. J. (2010b, November 29). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education: Part 1: Statistical analysis - Figures. Innovation & chaos ... in search of optimality. Retrieved November 29, 2010, from http://pogus.tumblr.com/post/1723717009/predicting-success-excellen...

Mellalieu, P. J. (2010c, November 30). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2a: Preparing for data mining. Innovation & chaos ... in search of optimality. Retrieved December 3, 2010, from http://pogus.tumblr.com/post/1983876991/predicting-success-excellen...

Mellalieu, P. J. (2010d, November 30). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2a: Preparing for data mining - Figures. Innovation & chaos ... in search of optimality. Retrieved November 30, 2010, from http://pogus.tumblr.com/post/1983667421/predicting-success-excellen...

Mellalieu, P. J. (2010f, December 3). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2b: Data mining with WEKA Explorer. Innovation & chaos ... in search of optimality. Retrieved December 3, 2010, from http://pogus.tumblr.com/post/2079452955/predicting-success-excellen...

Mellalieu, P. J. (2010g, December 5). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 3a: A comprehensive time-staged model - Figures. Innovation & chaos ... in search of optimality. Retrieved December 5, 2010, from http://pogus.tumblr.com/post/2100181658/predicting-success-excellen...

Mellalieu, P. J. (2010h, December 6). A Decision Support System for predicting success, excellence, and retention from students’ early course performance: a machine learning approach in a tertiary education programme in innovation and entrepreneurship: Part 1: Project summary. Innovation & chaos ... in search of optimality. Retrieved January 10, 2011, from http://pogus.tumblr.com/post/2110849868/a-decision-support-system-f...

Mellalieu, P. J. (2008). Writing to learn argument and persuasion: A “Trojan Horse” for promoting the adoption of “Writing Across the Curriculum” (WAC) principles (Working paper). Auckland, NZ: Unitec New Zealand Centre for Innovation & Entrepreneurship. Retrieved from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2008/7/31_Wr...(WAC)_principles.html


 

 

 

A Decision Support System for predicting Retention, eXcellence, & Success: The Dashboard for ReXS:

In late November, I began a journey exploring how I could apply data mining/machine learning to help predict students' performance throughout their semester of studies. The intention is to gain information within the first few weeks of the semester and develop prognoses for their future performance.

Over the last few weeks I have been testing, calibrating, and experimenting with a spreadsheet-based Decision Support System assembled from the relationships I identified in my analyses of last semester's student performance data.

The image shows the dashboard that a student uses to 'drive' the DSS. The cells in green show where the user enters data, including:

  • The grade they achieved for a specific assignment
  • The actual class mean and standard deviation for a specific assignment. If the data is not available, the statistics from a prediction dataset are used.

The DSS displays several outputs for the user:

  • Under MY GRADES (in orange): the per cent mark, letter grade, and weighted contribution to the final course grade - nothing special there!
  • Under CLASS PERFORMANCE (in blue): an estimate of the student's relative placing in the class, measured by the per cent of students below them in the class. This is derived simply from their mark and the class statistics (assuming a normal distribution of grades)
  • Under PREDICTED GRADES (Yellow): an estimate for the student's grades for all outstanding assignments in the course. This also includes a prediction of the student's overall relative placing in the class. This is the 'magic' part of the DSS.

In addition, the student can enter their actual grades and conduct 'what if' analyses to explore what grade for assignment components they need to achieve to gain their target grade for the course as a whole.
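
As a sketch of the 'what if' idea, using only the published course weights (the helper name and example marks are hypothetical), one can solve for the mark needed on a remaining assignment to reach a target final grade:

```python
# Sketch: a 'what if' calculation using the published course weights.
# Given marks already earned and a target final grade, solve for the mark
# needed on one remaining assignment. Example values are hypothetical.
WEIGHTS = {"ass1": 0.15, "ass2": 0.40, "ass3": 0.25, "ass4": 0.20}

def mark_needed(earned: dict, remaining: str, target: float) -> float:
    banked = sum(WEIGHTS[name] * mark for name, mark in earned.items())
    return (target - banked) / WEIGHTS[remaining]

# A student who has 71, 81, and 75 for Assignments 1-3 and wants 80 overall:
print(round(mark_needed({"ass1": 71, "ass2": 81, "ass3": 75}, "ass4", 80), 1))  # 91.0
```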

Tabs at the bottom of the DSS dashboard permit the user to explore the inner workings of the DSS. These inner workings include the prediction equations resulting from the machine learning data analysis: Predictors, Class stats, etc. The Babbage tab is a giant spreadsheet that shows all the calculations in full.

Project summary

Mellalieu, P. J. (2010, December 6). A Decision Support System for predicting success, excellence, and retention from students’ early course performance: a machine learning approach in a tertiary education programme in innovation and entrepreneurship: Part 1: Project summary. Innovation & chaos ... in search of optimality. Retrieved January 10, 2011, from http://pogus.tumblr.com/post/2110849868/a-decision-support-system-f...

Related information

Culver, T. (2010, January 21). Setting student retention goals and developing student retention strategy. Noel-Levitz. Retrieved January 23, 2011, from http://blog.noellevitz.com/2011/01/21/6-keys-setting-realistic-stud...

Culver, T. (2011). Mid-Year Retention Indicators Report for Two-Year and Four-Year, Public and Private Institutions: Benchmark Research Study Conducted Fall 2010. Higher Ed Benchmarks. Noel-Levitz. Retrieved from https://www.noellevitz.com/NR/rdonlyres/C72E0608-2024-4175-9A94-966...


This video gives a demo of how to use the ReXS spreadsheet.


Using the ReXS Decision Support System for predicting success and e... from Peter Mellalieu on Vimeo.

A machine learning/data mining exercise using the WEKA Explorer workbench identified the feasibility of predicting accurately students’ final course performance from formative and summative assessments conducted within the first three weeks of a 12-week semester (Mellalieu 2010, 2011; Witten & Frank 2005; Hall, Frank et al, 2009). This finding led to the decision of the course instructor to initiate construction of a prototype Decision Support System (DSS) to provide a student and their academic advisors the means to predict the student’s personal academic success and final grade as they progressed through the course (Mellalieu, 1982).

The DSS is implemented as an .xls spreadsheet, known as ReXS (for Retention, eXcellence, and Success). Inputs include the student’s grade achieved on assignment components throughout the period of the course, demographic data, and other psychometric data. The prediction system underpinning the DSS is based on several rules and regression equations derived from a test data set of student results from a previous delivery of the course in 2010.

The course is a first year tertiary education course in innovation and entrepreneurship that is compulsory for all students in the management and marketing majors of a Bachelor of Business programme. The DSS outputs a student’s estimated grade and class percentile ranking at the start of the course (Week 1), updated following assessments submitted by the student in Week 3, Week 6 (mid-way), and Week 12. A ‘last chance’ final assessment occurs during the examination period, Week 14, the result for which can be predicted from all the student’s previous assessment results. As the student progresses through the course, we expect the precision of the grade estimate to improve, since there is an increasing number of data items upon which to base the prediction. Furthermore, the student will know their actual ‘earned grade’ contribution to the final course grade.

Using the ReXS .xls spreadsheet
The video explains use of the ReXS dashboard that a student uses to ‘drive’ the DSS. The cells in green show where the user enters data, including:

- The grade they achieved for a specific assignment

- The actual class mean and standard deviation for a specific assignment. If the data is not available, the statistics from a prediction dataset are used (for example, projected from previous semester’s results).

The DSS displays several outputs for the user:

- Under MY GRADES (in orange): the per cent mark, letter grade, and weighted contribution to the final course grade - nothing special there!

- Under CLASS PERFORMANCE (in blue): an estimate of the student’s relative placing in the class, measured by the per cent of students below them in the class. This is derived simply from their mark and the class statistics (assuming a normal distribution of grades)

- Under PREDICTED GRADES (Yellow): an estimate for the student’s grades for all outstanding assignments in the course. This also includes a prediction of the student’s overall relative placing in the class. This is the ‘magic’ part of the DSS.

Furthermore, the student can enter their actual grades and conduct ‘what if’ analyses to explore what grade for assignment components they need to achieve to gain their target grade for the course as a whole.

Tabs at the bottom of the DSS dashboard permit the user to explore the inner workings of the DSS. These inner workings include the prediction equations resulting from the machine learning data analysis: Predictors, Class stats, etc. The Babbage tab is a giant spreadsheet that shows all the calculations in full.

ReXS spreadsheet
Available for download here:

Mellalieu, P. J. (2011, April 26). ReXS: Decision Support System for Retention, eXcellence, and Success (.xls spreadsheet). Peter Mellalieu - Teacher. Retrieved April 26, 2011, from http://preview.tinyurl.com/rexsdss1-2

References
Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., & Witten, I. H. (2009). The WEKA Data Mining Software: An Update. SIGKDD Explorations, 11(1).

Mellalieu, P. J. (2011, April 27). Predicting success, excellence, and retention from students’ early course performance (video). Presented at the research seminar, Department of Management & Marketing, Unitec Institute of Technology, Auckland. Retrieved from http://vimeo.com/22877834


Mellalieu, P. J. (2011, April 18). Predicting success, excellence, and retention from students’ early course performance: progress results from a machine learning-based decision support system in a first year tertiary education programme (Research seminar announcement). Innovation & chaos ... in search of optimality. Retrieved April 17, 2011, from http://pogus.tumblr.com/post/4702098630/a-research-seminar-predicti...

Witten, I. H., & Frank, E. (2005). Data mining: practical machine learning tools and techniques (2nd ed.). Morgan Kaufmann.

Predicting success, excellence, and retention from students' early course performance: progress results from a machine learning-based decision support system in a first year tertiary education programme (slideshow)

 

 

Predicting success, excellence, and retention from students' early ... from Peter Mellalieu on Vimeo.

Predicting success, excellence, and retention from students' early course performance: progress results from a machine learning-based decision support system in a first year tertiary education programme

Keynote presentation from a research seminar

Wednesday 27 April, 2011, 12 noon
Location: Room 172-4024

Peter J Mellalieu, pmellalieu@unitec.ac.nz

Department of Management & Marketing
Faculty of Creative Industries and Business
Unitec Institute of Technology, Auckland, New Zealand

Higher educational institutions are focussing increased attention on identifying which students are likely to succeed - or fail - in their tertiary studies. Historically, academics have been keen to identify the ‘bright young things’ they view as prospects for recruiting to postgraduate courses. Less attention has been paid to those students who fall by the wayside: there have been plenty of ambitious and talented students to take their place. More recently, institutions have been obliged to pay attention to identifying those students ‘at risk’ of failure or marginal grades and to conducting remedial activity. For instance, Culver (2010, 2011) reports on the services provided by the Noel-Levitz consultancy for improving institutional retention in North America. Culver argues that the business case for active management of student retention is compelling.

New Zealand higher educational institutions are beginning to recognise the need to follow the trend of managing student retention and success using approaches similar to those adopted by North American institutions. The driver for the New Zealand initiative is that government funding for higher education is increasingly being redirected towards a focus on outputs rather than inputs. Specifically, this entails a refocus of government funding towards funding student completions of academic programmes rather than simply student enrolments. It is noteworthy that in New Zealand, around one-half of higher education is funded by government.

Given this context, I constructed a prototype Decision Support System (DSS) that provides a student the means to predict their personal academic success and final grade as they progress through their course. The DSS is implemented as an interlocked series of spreadsheets, known as ReXS - for Retention, Excellence, and Success (Mellalieu, 2011a,b).

Inputs to ReXS include the student's grade achieved on assignment components throughout the period of the course, demographic data, the student's five talents as identified by the Gallup StrengthsFinder 2.0 instrument, and other psychometric data. The prediction system underpinning the DSS is based on several rules and regression equations derived from a test data set of student results from a previous delivery of the course in 2010. A schematic overview of the inter-relationship between several pertinent factors is presented in Mellalieu (2010 f). Figure 1 presents a snapshot of the dashboard user interface for the ReXS DSS.

A machine learning/data mining investigation using the WEKA Explorer workbench (Witten & Frank 2005; Hall, Frank et al, 2009) identified the feasibility of predicting accurately students' final course performance from formative and summative assessments conducted within the first three weeks of a 12-week semester. The prediction system underpinning ReXS was subsequently derived from a set of rules and regression equations determined using WEKA Explorer (Mellalieu 2010 a,b,c,d,e,f,g).

The course is a first year tertiary education course in innovation and entrepreneurship that is compulsory for all students in the management and marketing majors of a Bachelor of Business programme at Unitec Institute of Technology. The DSS outputs a student's estimated grade and class percentile ranking at the start of the course (Week 1). These statistics are updated following assessments submitted by the student in Week 3, Week 6 (mid-way), and Week 12. A 'last chance' final assessment occurs during the examination period, Week 14, the result for which can be predicted from all the student's previous assessment results... and other demographic data. The DSS output also includes a measure of the precision of the estimate, such as the statistical standard error of the grade estimate. As the student progresses through the course, the precision improves, since there are an increased number of data items upon which to make the prediction.



A most potent indicator of overall course performance revealed by the data mining exercises identified the crucial importance of a student's ability to write formal academic English in response to a written case study assignment. This finding suggests that if a student undertakes personal coaching in the antecedents required to rite gude inglish [sic] then their chances of achieving an overall higher grade will increase. I suspect these antecedent competencies include: reading for comprehension, paraphrasing information, writing persuasive and logical arguments, selecting appropriate words to express thoughts, writing Global English sentence structures, paragraphing using topic sentences, and presenting/organising arguments and evidence in an appropriate genre such as a formal business report (Mellalieu 2007a,b,c, 2008, 2010 g,h). The foregoing are a selection of core, generic academic competencies that might well underpin success in many other business and tertiary education courses.

Consequently, a DSS that provides students with 'early warning' of the likelihood of their academic success or failure based on an early assessment of these competencies may encourage students, their instructors, and advisors to take early, proactive action to remedy deficiencies both within and outside the classroom.

The machine learning analysis also revealed unexpected insights into the level of academic performance of a student team assignment. These results will be presented at the seminar.

Results from the ReXS DSS based on 2010 data have now been made available to students enrolled in the 2011 course. Students have completed their first three weeks of assessment. Accordingly each student has been provided with feedback on the prognosis for their subsequent course grades. Students have also been provided with advice on actions they can undertake to improve their grades.

The seminar will discuss
- illustrations of the predictions made by the DSS
- how the principles underlying the DSS can be extended to other courses
- opportunities for improving the utility of the DSS to students and academic staff.


The ReXS DSS Spreadsheet
Download the .xls spreadsheet from here:

Mellalieu, P. J. (2011, April 26). ReXS: Decision Support System for Retention, eXcellence, and Success (.xls spreadsheet). Peter Mellalieu - Teacher. Retrieved April 26, 2011, from http://preview.tinyurl.com/rexsdss1-2

Download a .pdf slideshow of this video here:
http://www.scribd.com/doc/53921799/Predicting-success-excellence-an...

Speaker notes
Peter Mellalieu teaches innovation, strategy, and entrepreneurship at Unitec Institute of Technology, Auckland, New Zealand. His first Decision Support System was used for strategic planning studies, including factory location and company merger-takeover, in the New Zealand dairy products manufacturing industry (Mellalieu, 1982; Mellalieu & Hall, 1983). At Unitec, he taught strategic thinking for several years using Thompson et al.'s DSS-based Business Strategy Game. Peter also has a current focus on developing trans-disciplinary academic literacies through eco-innovation, eco-strategy, and eco-entrepreneurship. His recent interest in machine learning/data mining grew from his postgraduate studies in operations research and systems modelling at New Zealand's national physics and engineering laboratory and his undergraduate studies in industrial engineering and information technology.


References
Bouckaert, R. R., Frank, E., Hall, M., Kirkby, R., Reutemann, P., Seewald, A., & Scuse, D. (2010). WEKA Manual for Version 3-6-3. Hamilton, NZ: The University of Waikato.

Culver, T. (2010, January 21). Setting student retention goals and developing student retention strategy. Noel-Levitz. Retrieved January 23, 2011, from http://blog.noellevitz.com/2011/01/21/6-keys-setting-realistic-stud...

Culver, T. (2011). Mid-Year Retention Indicators Report for Two-Year and Four-Year, Public and Private Institutions: Benchmark Research Study Conducted Fall 2010. Higher Ed Benchmarks. Noel-Levitz. Retrieved from https://www.noellevitz.com/NR/rdonlyres/C72E0608-2024-4175-9A94-966...

Decision support system - Wikipedia, the free encyclopedia. (n.d.). Retrieved February 3, 2010, from http://en.wikipedia.org/wiki/Decision_support_system

Guillaume, D., & Khachikian, C. (2011). The effect of time-on-task on student grades and grade expectations. Assessment & Evaluation in Higher Education, 36(3), 251-261. Retrieved from http://dx.doi.org/10.1080/02602930903311708


Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., & Witten, I. H. (2009). The WEKA Data Mining Software: An Update. SIGKDD Explorations, 11(1).


Mellalieu, P. J. (1982). A Decision Support System for Corporate Planning in the New Zealand Dairy Industry (Doctor of Philosophy thesis in mathematics, statistics and operations research). Victoria University of Wellington, Wellington, New Zealand. Retrieved from http://nzresearch.org.nz/index.php/record/viewSchema/21040/3

Mellalieu, P. J. (1999). Creating the A+ assignment: A project management approach (Working Paper). Massey University, College of Business. Retrieved from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2007/10/21_C...

Mellalieu, P. J. (2007a, June 3). Let’s all learn and teach Global English in our business schools! Peter Mellalieu - Teacher. Retrieved April 21, 2010, from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2007/7/3_Let...!.html

Mellalieu, P. J. (2007b, July 6). The Massey Writing Across the Curriculum Model: A manifesto for the renaissance of an international business school? Retrieved October 11, 2009, from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2007/7/6_The...

Mellalieu, P. J. (2007c, October 18). Model answer: A “Five Paragraph” essay in management. Retrieved July 27, 2009, from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2007/10/18_M...

Mellalieu, P. J. (2008). Writing to learn argument and persuasion: A “Trojan Horse” for promoting the adoption of “Writing Across the Curriculum” (WAC) principles (Working paper). Auckland, NZ: Unitec New Zealand Centre for Innovation & Entrepreneurship. Retrieved from http://web.mac.com/petermellalieu/Teacher/Blog/Entries/2008/7/31_Wr...(WAC)_principles.html

Mellalieu, P. J., & Hall, K. R. (1983). An Interactive Planning Model for the New Zealand Dairy Industry. Journal of the Operational Research Society, 34, 521-532. doi:10.1057/jors.1983.119

Mount, J. (2009, August 19). A Demonstration of Data Mining. Win-Vector Blog (The Applied Theorist’s Point of View). Retrieved December 4, 2010, from http://www.win-vector.com/blog/2009/08/a-demonstration-of-data-mining/

Paper for PedR meeting: The effect of time-on-task on student grades and grade expectations « DrBadgr. (n.d.). Retrieved April 12, 2011, from http://drbadgr.wordpress.com/2011/04/11/paper-for-pedr-meeting-the-...

Springer, S. P., Franck, M. R., & Reider, J. (2009). What Do Selective Colleges Look for in an Applicant? The Academic Record. Education.com. Retrieved April 12, 2011, from http://www.education.com/reference/article/selective-colleges-appli...

Springer, S. P., Reider, J., & Franck, M. R. (2009). Admission Matters: What Students and Parents Need to Know About Getting into College (2nd ed.). Jossey-Bass.

Thompson, A. A., Stappenbeck, G. J., Reidenbach, M. A., Thrasher, I. F., & Harms, C. C. (n.d.). Business Strategy Game Simulation. Retrieved July 7, 2009, from http://www.bsg-online.com/

Weka 3 - Data Mining with Open Source Machine Learning Software in Java. (n.d.). Retrieved November 21, 2010, from http://www.cs.waikato.ac.nz/ml/weka/

Witten, I. H., & Frank, E. (2005). Data mining: practical machine learning tools and techniques (2nd ed.). Morgan Kaufmann.

Author’s progress results
Mellalieu, P. J. (2010a, November 29). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 1: Statistical analysis. Innovation & chaos ... in search of optimality. Retrieved November 29, 2010, from http://pogus.tumblr.com/post/1724117822/predicting-success-excellen...

Mellalieu, P. J. (2010b, November 29). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education: Part 1: Statistical analysis - Figures. Innovation & chaos ... in search of optimality. Retrieved November 29, 2010, from http://pogus.tumblr.com/post/1723717009/predicting-success-excellen...

Mellalieu, P. J. (2010c, November 30). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2a: Preparing for data mining. Innovation & chaos ... in search of optimality. Retrieved December 3, 2010, from http://pogus.tumblr.com/post/1983876991/predicting-success-excellen...

Mellalieu, P. J. (2010d, November 30). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2a: Preparing for data mining - Figures. Innovation & chaos ... in search of optimality. Retrieved November 30, 2010, from http://pogus.tumblr.com/post/1983667421/predicting-success-excellen...

Mellalieu, P. J. (2010e, December 3). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 2b: Data mining with WEKA Explorer. Innovation & chaos ... in search of optimality. Retrieved December 3, 2010, from http://pogus.tumblr.com/post/2079452955/predicting-success-excellen...

Mellalieu, P. J. (2010f, December 5). Predicting success, excellence, and retention from early course performance: a comparison of statistical and machine learning methods in a tertiary education programme: Part 3a: A comprehensive time-staged model - Figures. Innovation & chaos ... in search of optimality. Retrieved December 5, 2010, from http://pogus.tumblr.com/post/2100181658/predicting-success-excellen...

Mellalieu, P. J. (2010g, December 6). A Decision Support System for predicting success, excellence, and retention from students’ early course performance: a machine learning approach in a tertiary education programme in innovation and entrepreneurship: Part 1: Project summary. Innovation & chaos ... in search of optimality. Retrieved January 10, 2011, from http://pogus.tumblr.com/post/2110849868/a-decision-support-system-f...


Mellalieu, P. J. (2011a, April 12). A Decision Support System for predicting Retention, eXcellence, & Success: The Dashboard for ReXS. Innovation & chaos ... in search of optimality. Retrieved April 12, 2011, from http://pogus.tumblr.com/post/4548163448/image-via-wikipedia-a-decis...

Mellalieu, P. J. (2011b, April 12). A Decision Support System for predicting Retention, eXcellence, & Success: Feedback to my students using ReXS. Innovation & chaos ... in search of optimality. Retrieved April 12, 2011, from http://pogus.tumblr.com/post/4550061605/a-decision-support-system-f...
