Powerful NAPLAN insights with Bloum's new tool for schools

NAPLAN results have now been released in every state across the country, but many schools still wrestle with extracting practical insights from the raw data. In fact, 78% of teachers don’t believe NAPLAN results will help them identify areas where their students need to improve. 

When used correctly, NAPLAN results can provide powerful insights that genuinely improve classroom performance, with the potential for personalised learning journeys for each student. Unfortunately, because most schools largely ignore NAPLAN in their planning, all of that data goes to waste. 

To help bridge this gap, Bloum is rolling out new features this month that deliver automated NAPLAN insights. Staff members can simply let the platform perform its analysis and use the insights in the classroom. Our press release details these features further. 

 

The difficulty of interpreting NAPLAN results

The first roadblock to using NAPLAN in the classroom is interpretation: do you know what the results are actually saying? 

Unfortunately, NAPLAN Scale Scores aren’t a simple dataset. A Scale Score means little on its own; without context, it’s hard to tell whether a Scale Score of 500 in reading is a good or a poor result. The only way to evaluate Scale Scores effectively is to compare them against school, state and national averages. 

Additionally, when using the scale, it’s difficult to understand individual student growth between tests. Students usually become more confident and competent in their skills as they grow up, but how much improvement is expected? 

A student who achieves a difference of 45 in reading Scale Scores between Year 3 and Year 5, for example, has achieved below-average growth, but the same difference between Year 5 and Year 7 can be considered very healthy progression. 
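To make that concrete, here is a tiny illustrative sketch of the mental comparison a teacher has to make. The typical-gain figures are hypothetical placeholders rather than official NAPLAN benchmarks; they simply capture the pattern described above, where raw-point growth slows at higher year levels.

```python
# Illustrative only: the typical-gain numbers below are hypothetical placeholders,
# not official NAPLAN benchmarks.
TYPICAL_GAIN = {
    ("Year 3", "Year 5"): 80,  # hypothetical typical two-year gain in raw points
    ("Year 5", "Year 7"): 40,  # hypothetical: raw-point growth slows at higher levels
}

def describe_growth(from_year, to_year, gain):
    """Compare a raw Scale Score gain against the typical gain for that transition."""
    typical = TYPICAL_GAIN[(from_year, to_year)]
    verdict = "at or above" if gain >= typical else "below"
    return f"+{gain} points from {from_year} to {to_year}: {verdict} typical growth ({typical} points)"

# The same raw gain of 45 reads very differently depending on the transition.
print(describe_growth("Year 3", "Year 5", 45))
print(describe_growth("Year 5", "Year 7", 45))
```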

Bloum works to address these challenges with a visually friendly interface that non-data-native teachers can read easily. You can find a sample screenshot below: 

Examples of NAPLAN insights 

Once a school can interpret NAPLAN results intelligently and efficiently at scale, what happens next? Are there useful insights that can be lifted out and applied in the classroom? 

The following examples show how powerful NAPLAN insights can be. They are drawn from the Grattan Institute’s 2016 report, produced in affiliation with organisations including Google, PwC, Ernst & Young, Westpac and many others. 

While all classroom improvement must ultimately be led by capable and enthusiastic teachers, it’s clear that well-interpreted NAPLAN results can genuinely improve a teacher’s understanding of their class’s performance. In fact, what is sometimes lost in the NAPLAN conversation is how powerful the system can be: very few countries have nationwide, cohort-by-cohort data delivered via a standardised series of exams. 

 

Student bands don’t show steady, consistent progress 

Students do not maintain the same rate of progression as they move through school. Most students in Year 3 achieve at a level relatively close to their peers, with 60% of students falling within a 2.5-year range.  

However, that range widens to 5.5 years by Year 9, and across the whole class the spread typically reaches seven years. The top 10% of students in Year 9 are about eight years ahead of the bottom 10%.  

That makes the Year 9 classroom environment very difficult for teachers to manage, with no general ‘average’ to teach towards. 

It also means both high achievers and low achievers can find the classroom unfair or unresponsive to their needs. 

 

Low-achieving students fall behind faster 

So where does the widening gap come from? Unfortunately, research shows that much of it comes from low achievers falling further behind as they advance through school.  

This idea can be difficult to see without understanding metrics such as Bloum’s new Years of Progress feature (see below), because the findings are counterintuitive. 

The raw difference in NAPLAN Scale Scores suggests that the gap actually narrows between high and low achievers; low achievers gain 211 points between Year 3 and Year 9, while high achievers only gain 156 points. 

However, in a nod to the difficulty of interpreting NAPLAN scores, a gain of 50 points means different things at different starting Scale Scores. 

Low achievers are already 2.67 years behind their high-achieving peers in Year 3, and by Year 9 they have fallen a further 12 months behind, sitting 3.67 years in arrears. For more, see our Years of Progress metric below or in the press release. 
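Restated as plain arithmetic, using only the figures quoted above, the two views of the same data look like this:

```python
# The two views of the same Grattan Institute figures quoted above.

# Raw Scale Score gains between Year 3 and Year 9:
low_achiever_gain = 211
high_achiever_gain = 156
print("Raw-point view: low achievers gain", low_achiever_gain - high_achiever_gain,
      "points more, so the gap appears to narrow")

# The same cohorts expressed as equivalent years of learning behind their peers:
gap_in_year_3 = 2.67
gap_in_year_9 = 3.67
print("Years-of-progress view: the gap widens by",
      round((gap_in_year_9 - gap_in_year_3) * 12), "months")
```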

 

High achievers in disadvantaged schools show the biggest learning gap 

In another concerning sign of widening inequality, it’s bright students in disadvantaged schools who show the biggest loss of potential. 

Students in disadvantaged schools who are high NAPLAN achievers in Year 3 make approximately 2.5 years less progress than similarly capable peers in advantaged schools. 

And the extent of this gap is highly worrying. Overall, from Year 3 to Year 9, these high achievers in disadvantaged schools actually make less progress (5.67 years) than low achievers in advantaged schools (6.83 years). 

Importantly, the study does not suggest that staff members in disadvantaged schools are cutting corners or failing in their job. 

Rather, the data points to the need for further support in disadvantaged schools to help close the gap. 

 

Where Bloum comes in 

If schools can use the NAPLAN data at their disposal to draw these kinds of insights, they can implement powerful changes in the classroom. As such, we have developed these two new metrics to help teachers genuinely interpret and utilise NAPLAN data. 

An important consideration was ease of use: over the past 20 months of pandemic-affected online learning, 75% of teachers worked more hours than usual, so they don’t need more data analysis on their plate. 

These two user-friendly metrics automatically help teachers interpret the standard NAPLAN Scale Score. In turn, teachers can support their students with personalised teaching that addresses their specific needs.

This image has been taken from the platform and features a sample student’s scores:

Sample Student scores on Bloum

The Equivalent Year Level metric 

The first new feature we’re rolling out is the Equivalent Year Level metric. It converts any given Scale Score into the year level at which the average student would achieve that score. With this, a teacher can instantly see whether a student is overperforming, underperforming, or performing as expected. 

Here’s an example that illustrates the Equivalent Year Level metric in use: 

Jett is in Year 9 and has achieved a NAPLAN Scale Score of 624 in Reading. Using Bloum’s new feature, this score translates to an Equivalent Year Level of 11.8. That means Jett’s reading skills are nearly three years more advanced than those of the average Year 9 student. His teacher Mrs Bloum can then take this knowledge and challenge Jett with more difficult texts or chapters for homework. Without this metric, Mrs Bloum only knows that Jett is a bright student but cannot measure his true ability.  
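For readers curious about the mechanics, here is a minimal sketch of the general idea behind an Equivalent Year Level conversion: interpolate a Scale Score against average scores at each year level. The benchmark figures below are hypothetical placeholders (not official NAPLAN averages), and this is not Bloum’s actual conversion method.

```python
# A minimal sketch of the Equivalent Year Level idea. The benchmark scores below
# are hypothetical placeholders, not official NAPLAN averages, and this is not
# Bloum's actual conversion method.
import numpy as np

YEAR_LEVELS      = [3, 5, 7, 9, 11, 13]
READING_AVERAGES = [430, 505, 545, 580, 610, 640]  # assumed average Reading Scale Scores

def equivalent_year_level(score):
    """Interpolate a Scale Score onto the year-level axis."""
    return float(np.interp(score, READING_AVERAGES, YEAR_LEVELS))

print(round(equivalent_year_level(624), 1))  # ~11.9 with these placeholder benchmarks
```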

The Years of Progress metric 

Complementing the new Equivalent Year Level metric, the Years of Progress metric shows how many years of growth a student has achieved between NAPLAN tests. Schools commonly aim for one year of growth per year of schooling, but an individual’s progression usually isn’t linear. 

As above, an illustration can better explain the use of the Years of Progress metric: 

Jett has also achieved a NAPLAN Scale Score of 586 in Writing, which represents an increase of 76 from his equivalent Year 7 score of 510. According to the Years of Progress metric, Jett has achieved 4.1 years of growth between these exams, far exceeding the expected two years. In fact, in Year 7 Jett was considered a below-average writer, but he is now approximately one year ahead of expectations. His teacher Mrs Bloum can therefore recognise and reward him for his hard work across high school. Without this metric, it would have been difficult for Mrs Bloum to spot Jett’s overperformance, because she had not taught him in Years 7 and 8. 
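The same interpolation idea extends naturally to Years of Progress: convert both sittings to an equivalent year level and take the difference. The Writing benchmarks below are again made-up placeholders, chosen so the sketch roughly reproduces Jett’s example; they are not Bloum’s published values.

```python
# A standalone sketch of the Years of Progress idea, using hypothetical Writing
# benchmarks (not official NAPLAN averages or Bloum's published method).
import numpy as np

YEAR_LEVELS      = [3, 5, 7, 9, 11, 13]
WRITING_AVERAGES = [420, 490, 530, 570, 600, 625]  # assumed average Writing Scale Scores

def equivalent_year_level(score):
    return float(np.interp(score, WRITING_AVERAGES, YEAR_LEVELS))

def years_of_progress(earlier_score, later_score):
    """Growth between two NAPLAN sittings, in equivalent years of learning."""
    return equivalent_year_level(later_score) - equivalent_year_level(earlier_score)

print(round(years_of_progress(510, 586), 1))  # ~4.1 under these made-up benchmarks
```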

 

About Bloum 

Bloum is a learning analytics platform that interprets school data for your staff, allowing non-data-savvy teachers to enjoy the power of analytics. The platform actively pushes strategies and recommendations for teachers to guide students, giving them the confidence to make data-driven decisions. 

If you’re interested in learning more about how Bloum’s new NAPLAN features can help your school, talk to us today for a free demo of the platform.  

 


How data improves the student experience

A school that successfully leverages the power of data and analytics can improve its decision-making, deliver successful learning outcomes, and, above all, elevate the student experience. Data enables the school to capture each student’s unique background, strengths, interests, passions, and favourite subjects, which then allows teachers to adjust their teaching to each student’s needs. 

But teachers can often struggle when attempting to use and interpret the enormous amounts of data at their disposal. Teachers can face technological barriers in using data or information management platforms, and they may not have the digital literacy skills to correctly interpret the underlying student issues behind the data. Plus, at the school level, the structures and processes in place can either enhance or hinder the data collection process. 

Schools therefore need to find a way to solve the issue for staff members. The Bloum platform, for example, consolidates data from various sources without a teacher needing to analyse it and draw their own conclusions.  

Once this challenge is overcome, data is an invaluable tool that empowers school-wide decision-making. Used correctly, data can improve learning outcomes and visibility for students, parents, teachers, school leadership teams, and relevant stakeholders. 

 

Why data is essential in the 21st century 

In the 21st century, student data is no longer restricted to a list of end-of-term test scores for each student. Instead, data should capture attendance, behaviour, learning styles, and ongoing formative performance at any given point in time throughout the term. 

Without this kind of data, teachers make decisions based on gut feel, reputation, heuristics, and incomplete assumptions. While a teacher’s instincts should never be overlooked, it’s important to supplement their personal experience with accurate, timely data to enhance student outcomes. 

Teachers, for instance, often find it easy to pick out students who are severely struggling and need further assistance in the classroom. It’s also simple for them to identify outstanding students who can be further challenged beyond the prescribed syllabus. 

But what about the grey area of students in the middle? Teachers without data (and the analytics to interpret it) may not understand which student needs what kind of assistance. Students who are quiet strugglers are often overlooked. And if classroom data fails to capture year-on-year growth across multiple classes, these quieter students can cruise under the radar for years, never receiving the hands-on assistance they require. 

Ultimately, data goes hand in hand with a teacher’s expertise; it never replaces it. Especially in the wake of lockdown-affected online learning, student growth and wellbeing are incredibly important, and they can’t be replaced with purely data-driven, exam-focused learning. However, it’s equally important to complement the personal side with more analytics, enabling teachers to truly understand how they can best help their students. 

 

What counts as good data? 

Not all data leads to positive student outcomes; broadly, it can be split into good data and bad data. Because Bloum’s analytics platform relies on data to deliver its insights, we’ve thought hard about this distinction. Ultimately, we boiled it down to two determining factors: 

  • How often the data is recorded, and 
  • How accurate the data is 

It’s very important, for example, for schools to record student attendance every day. Attendance records that are noted haphazardly make it difficult to draw any confident conclusions. Meaningful improvements can’t be made if teachers are unsure whether a student really failed to attend class on a certain day or whether they simply forgot to record it. 
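As a simple illustration of what ‘recorded often enough’ can look like in practice, the sketch below flags school days on which a class has no attendance record at all, so gaps in record keeping stand out rather than being silently read as absences. The data shape is hypothetical and not Bloum’s actual schema.

```python
# Illustrative completeness check: find school days with no attendance record.
# (Hypothetical data shape; not Bloum's actual schema.)
from datetime import date, timedelta

def school_days(start, end):
    """All weekdays between start and end, inclusive."""
    day = start
    while day <= end:
        if day.weekday() < 5:  # Monday to Friday
            yield day
        day += timedelta(days=1)

def missing_attendance_days(recorded_days, start, end):
    """School days with no attendance record of any kind for the class."""
    return [d for d in school_days(start, end) if d not in recorded_days]

# Example: a fortnight in which two school days were never marked at all.
recorded = {date(2021, 11, d) for d in (1, 2, 3, 5, 8, 9, 11, 12)}
print(missing_attendance_days(recorded, date(2021, 11, 1), date(2021, 11, 12)))
```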

Beyond that, to make it easier for teachers to record data accurately and frequently, schools need to build reliable, healthy processes that govern data record keeping. It’s also essential that teachers stick to these processes. 

One school we’ve worked with had a problem with the accuracy of their attendance data. Upon further investigation, we discovered the real problem: when students showed up to class late, teachers often didn’t send them back to reception for a late notice. Instead, the class simply carried on. This is an example of how non-compliance with underlying processes leads to bad data. 

Teachers are more overburdened than ever in the wake of the coronavirus, with 75% of secondary teachers reporting that they worked more hours during pandemic-affected online learning periods. Effective back-end processes that teachers can easily follow relieve this burden. To make sure the school records good data, the leadership team should look to incentivise teachers to follow these data practices. 

 

How data enhances the student journey 

So what happens after schools have obtained their good data? For many schools, the problem actually starts here. In the modern learning environment, schools are frequently inundated with data they don’t know how to interpret. 

This glut of data means powerful insights often remain hidden and missed. The data may carry a message, but that counts for little if the message is lost. The important part is to use and interpret the data to improve each student’s experience: their personal journey, growth, and learning outcomes. 

Some schools then put the onus on teachers to become more data-savvy and take on additional work. Here at Bloum, we don’t think this is the best way. With research showing that teachers are more stressed than before and working additional hours due to COVID-19, we think the opposite approach is better: making the data work for teachers. 

For instance, Bloum’s cloud-based platform analyses and interprets the data for teachers, pushing out recommendations and allowing them to see only what’s most relevant. It also collates the data in a readable format, allowing teachers without data-native skills to still leverage the insights of a more robust, data-driven approach. With a data interpretation platform like Bloum at the school’s disposal, the leadership team can then make decisions with confidence that they genuinely enhance the student journey. 

Importantly, with its ability to interpret data analytics, Bloum empowers schools to incorporate formative, ongoing assessments to track student growth at any given point during the semester, leading to greater visibility on student progression that complements summative end-of-semester exams.  

Finally, data represents an objective, reliable resource that students can engage with directly, offering them the opportunity to set their own goals. It truly puts them at the centre of the learning experience. As the Data Quality Campaign puts it, data allows students to say: 

I know my strengths and where I need to grow. I can shape my own education journey. 

 

How Bloum empowers students and teachers 

For non-data-native staff members, Bloum uses analytics to interpret the data and push out actionable recommendations that improve the student experience. By drawing on the data sets within schools, Bloum enables more informed decision-making to improve student learning and progression.  

With its powerful predictive insights, Bloum takes away the burden from teachers by performing the work for them — the platform will mine through the information for relevant patterns.  It can then help teachers build out an individual learning plan and a proactive roadmap with next steps to guide student progression. 

If you’re interested in learning more about how to use learning analytics to empower student progression, contact us today for a free demo of the platform. 


Learning analytics for a modern world

Coming off more than 18 months of hybrid learning and online learning fatigue, student wellbeing and mental health have never been more important.

Schools must do their best to create a supportive, nurturing learning environment for all students. Unfortunately, modern research shows that traditional approaches to assessments aren’t doing the job.

They create additional stress, don’t lead to the required educational outcomes, and fail to capture each student’s personal learning styles.

 

What does the research say?

Leading research by the Gonski Institute over the past two years has pushed for the introduction of ongoing assessments in place of traditional end-of-year exams. These ongoing assessments can build a richer picture of learner profiles beyond exam scores, including the way students learn and how they apply their knowledge.

The elimination of exams would also move away from ‘high-stakes’ testing, which is known to cause stress, anxiety and unhappiness for students.

Central to the development of ongoing assessments must be the integration of technology and learning analytics into the curriculum. These kinds of analytics can help teachers see the holistic student journey, including learning behaviour and areas for improvement.

That way, they’ll be able to give personalised feedback to nurture the growth of each student. Notably, ongoing and personalised assessments should significantly improve equity in the classroom — providing additional assistance to students from disadvantaged backgrounds.

This kind of growth- and student-focused learning (as opposed to rote learning designed to pass exams) is especially critical after two consecutive pandemic-interrupted school years. Year 11 student Ahelee Rahman, writing in The Age, sums up the point beautifully:

As well as studying, school is where we learn to communicate, collaborate, build friendships, take leadership roles and be a part of a community — the learning that doesn’t feel like learning. And this is one of the most important parts of school.

 

How traditional end-of-year exams fall short

In its 2020 report on NAPLAN, the Gonski Institute showed that NAPLAN has resulted in improvements for primary school students but delivered no discernible progress in secondary schools.

The writing test was a particular point of concern: student writing performance has not improved in Years 3 and 5, and has even declined in Years 7 and 9.

The latest Gonski Report recommends moving academic assessments away from end-of-year examinations and towards ongoing progress assessments. To go hand-in-hand with this, teaching should be personalised for each student, ensuring each individual grows and progresses in a healthy, meaningful manner. Leveraging a learning analytics platform is crucial to informing and providing insights into how to encourage this growth.

Data and analytics are somewhat dirty words in the education sector; they often conjure images of schools turning into education factories and students losing their individuality. But it’s inflexible examination systems like NAPLAN that often lead to this kind of thinking, as they emphasise scores and data over student wellbeing, individuality, and holistic performance.

NAPLAN uses aggregate test scores to measure the overall performance of the school, which is then conveyed back to parents. However, this approach fails to take into account student-centric growth and learning.

Instead, all students are jumbled together within each school. To create a truly student-centric learning experience, teachers should be able to identify personalised learning journeys for students.

 

A more holistic approach to measuring education

The Gonski Institute report recommends the adoption of a new national assessment system with ongoing assessments. Assessments would then become more holistic, skills-focused, and teacher-led rather than government-led, reducing the emphasis on memorising and passing tests.

Spread out over the full school year, these ongoing assessments would provide several benefits to students:

  • Lower stress levels with no ‘high-stakes’ examinations
  • A more holistic idea of student performance beyond exam scores
  • More insightful data into student success

Ongoing teacher-led assessments then provide an opportunity to create individual learning plans for students. NAPLAN occurs only once every two years for each student, so there is limited scope for it to benefit individual learning.

Instead, the NAPLAN mainly benefits government and parent visibility. But truly individualised ongoing assessments will allow students to learn in a manner that genuinely caters to their needs and maximises their own potential.

The idea of moving away from traditional end-of-year exams is already making the rounds across the education sector, even in Years 11 and 12. South Australia has recently decided to implement learner profiles for each student as an alternative to the ATAR.

Understanding each student’s personal learning style can provide additional benefits to each child, not least because it moves away from traditional ideas of student success associated only with study scores and exam results.

 

How a learning analytics platform helps you

In the new ideal education environment of learner profiles and ongoing assessments, traditional ‘teaching to the test’ receives far less emphasis. Instead, more employment-ready skills can be elevated, including how to apply ideas creatively, how to research, and how to work in a team.

Of course, this kind of teaching represents a learning curve for staff members and will be more difficult than simply marking exams against a grade. That makes analytics critical to facilitating the change, simplifying the process for teachers, and letting them understand whether their new approach is working.

This may also suggest that teachers need to upskill and learn how to analyse data, but an effective analytics platform should do that work for them, letting staff members focus on their teaching.

Bloum, for example, pulls out educational insights by using powerful machine learning to consolidate data from various sources, without the user needing to analyse the data to extract their own conclusions.

Built on a modern, cloud-based platform, Bloum is designed to allow teachers to get an accurate snapshot of how each student is performing at any point during semester.

Obtaining these insights during each term, rather than at the end of a reporting period, means teachers can assist students at the exact point where they need help.

These analytics should be able to capture more of what happens in the classroom beyond test results. For example, education analytics that track student behaviour, progress, learning tendencies, and areas for improvement allow for the development of individualised learner profiles, boosting student happiness and progression.

Education analytics is crucial to making the new system of ongoing assessments work, but using that data intelligently and turning it into personalised plans can be quite difficult for teachers. 

A platform like Bloum eases the transition by doing all the work for teachers, allowing staff members to focus only on their teaching.

 

What next for schools?

If you’re interested in learning more about how to use learning analytics to empower student progression, speak to the team at Bloum today.