
Wednesday 29 September 2021

A treasure trove of insight on the English school assessment sector


If you’ve read any of my blogs before you’ll know I’m a data geek. However, the first blog I wrote (in 2014) contained no data whatsoever. Rather, it was mostly a grumble about the lack of good quality data in education. 

Then I managed to get hold of data on the school Management Information System market, so I started focusing on that in my blogs. This made some MIS-market watchers happy, but friends in other sectors would contact me and say things like “hey, we don’t care about MIS as much as you, so can you do the same thing for our sector?” And sadly, the answer was invariably “no”, because I just didn’t have the data to do justice to the question.


Until now, that is.


Because, for those of you who track English school assessment, the very good news is that I’ve found my dream data partners in the shape of the good people at Education Intelligence (the team behind Teacher Tapp). I don’t think this is as widely known as it should be, but they have this super-cool thing called the Education Intelligence Tracker, which keeps tabs on what school staff think about education suppliers. They track *literally* hundreds of companies by asking the following questions to several thousand teachers at a time:


  1. Have you heard of the following company?

  2. Do you currently use or have you used the following company in the past 6 months?

  3. Would you recommend the following company to a colleague?


And so, joy of joys, my consultancy (Edtech Experts) and Education Intelligence have worked together over the past few months to publish a data-rich insights package focusing on the English school assessment sector. Here’s what we’re offering:


  1. A dataset of the 57 companies covered by Teacher Tapp with a significant assessment component to their offer. Yes, you read that right, 57 companies! The neatly structured dataset includes answers to the questions above at overall level, and also broken down by lots of sub-categories (e.g. primary vs secondary, teacher vs middle leader / senior leader etc). This is the first time that Teacher Tapp have ever made this much of their data available in one purchase, and as a consequence the bundle is brilliant value for money.

  2. A report written by Edtech Experts both analysing Teacher Tapp’s data and also incorporating qualitative insights from interviews with a wide range of sector experts. So as well as the data analysis you get sections on how the assessment market is segmented, the big trends in English school assessment, how schools procure, a market sizing exercise, and even a “deals summary” covering significant investment and acquisition activity at the focus companies over the past 3 years.


The package will therefore help you to understand:


  • How the English school assessment sector is structured and segmented 

  • Which are the growth areas 

  • Which assessment products are popular (and which ones are not)

  • Strategies investors are using to expand in the assessment space

  • How schools procure assessment products


If that’s all still a bit abstract, here are a few nuggets that made me raise my eyebrows as we put the report together, and which you’ll be able to explore in much more detail by purchasing the package:


  • The combined market share of Microsoft Teams and Google Classroom in English schools is now over 100%! That means most schools are seemingly using one of the two, and some are using (or experimenting with) both.

  • Single subject and quizzing products are very popular with teachers; assessment management platforms (including MIS) are not.

  • Standardised assessment is big business.


Special thanks go to Ed Tranham of The Assignment Report, who was hugely helpful in pulling together the deals summary - if you don’t have a subscription, I thoroughly recommend them as a source of news on the business of education.


So, if you’d like to know more about our insights package on the English school assessment sector, or get an idea of pricing, drop me an email at josh@edtechexperts.co.uk, or DM me on LinkedIn or Twitter.

Saturday 24 July 2021

MIS MARKET UPDATE (Summer 2021): SIMS dip below 70% market share

Disclaimer: I have past and present commercial relationships with many MIS vendors, including an ongoing involvement with Compass, an Australia-based MIS that is launching in the UK. Nonetheless, I aim to write this blog impartially, from the perspective of a neutral observer. This matters to me - it's basically the blog I wish had existed back when I was a MAT MIS commissioner trying to get a handle on all things relating to MIS and edtech. I also now provide MIS market datasets and reports as a service and offer free, informal consultations on MIS procurement to schools and MATs. If you would like to discuss any of this, contact me on Twitter or LinkedIn.

No time for small talk this term - I'm going to get straight to the point. I've just got hold of the latest termly data on the School Management Information Systems (MIS) used by state schools in England. Here are the headlines:

  1. SIMS dipped below 70% market share when measured by number of schools. I now have eleven years of MIS market data. SIMS' market share peaked in 2012 at 84%. This summer they dropped below 70% (69.6% to be exact). It's higher when measured by pupil numbers (74.6%), but still, the direction continues to be downwards however you cut it.
  2. 528 schools switched over the past term. That's high by historical standards. We don't have exact data for 2020, because there was no summer census (damn you, COVID, for messing up my beautiful term-by-term dataset), but my educated extrapolation is that the figure for the equivalent period last year was 480-490, and in 2019 the summer term number was 430. That follows a strong number the previous term, which leads me to estimate that there will be c. 1,000-1,050 switchers in the full year to autumn 2021 when we get next term's numbers. That would be the highest figure since (my) records began in 2010.
  3. Wins are still dominated by Arbor, Bromcom and ScholarPack. This isn't news, but there are some interesting subplots to the story. On which note...
  4. Arbor and Bromcom are neck and neck with secondary wins. Bromcom remains the dominant challenger with secondaries - they have 299 schools compared to 135 for Arbor. However, Arbor now match them when it comes to new wins. In 2019, Bromcom won four new schools for every school joining Arbor. But over the past term, 29 secondaries joined Arbor, compared to 22 joining Bromcom. Why the change? Well, United Learning had a lot to do with it, accounting for 17 of Arbor's wins. But still, something seems to have changed: over the past four terms, the two have been more or less equally successful with secondaries (Bromcom has 87 wins to Arbor's 84). 
  5. MAT schools are still more likely to switch, but LA schools are catching up. Around 45% of English state schools are academies, but academies (most of which are now in MATs) represent 62% of the switchers. That said, LA schools are increasingly getting in on the switching act: in Q1 of 2018 they represented just 14% of switchers, whereas last term 38% of switchers were LA schools.
  6. Pupil Asset (now sold as Juniper Horizons) won more schools than SIMS over the past term. It's been a quiet few years for Pupil Asset, so the Juniper team will be heartened to see 25 schools come their way this term. That's not at the level of the "big three" challengers, but it was more than SIMS managed to win over the same period (they had 18 wins). Aside from bragging rights for the non-trivial number of ex-SIMS employees now at Juniper, this also points to how SIMS hasn't (yet) found a way of winning schools away from the challengers. Having said that, new owners Montagu have only just had their merger with ParentPay confirmed, and so it feels too early to judge the impact of the new owners' strategy on the long-term prospects of the business.
As always, here are some pretty charts which you can use to explore for yourself:

Saturday 19 June 2021

MIS MARKET UPDATE (JAN 2021 DATA): Churn, Baby Churn.

Disclaimer: I have past and present commercial relationships with many MIS vendors, including an ongoing involvement with Compass, an Australia-based MIS that is launching in the UK. Nonetheless, I aim to write this blog impartially, from the perspective of a neutral observer. This matters to me - it's basically the blog I wish had existed back when I was a MAT MIS commissioner trying to get a handle on all things relating to MIS and edtech. I also now provide MIS market datasets and reports as a service and offer free, informal consultations on MIS procurement to schools and MATs. If you would like to discuss any of this, contact me on Twitter or LinkedIn.

I feel sorry for local news organisations. I grew up in the south west of England, and occasionally stuff happened, but more often than not the local news sounded like:

Something something council business, bla bla squirrels are hurting trees, yadder yadder oh look a royal came to the County Fair... 

You get the idea.

Nonetheless, those dutiful regional news teams still had to put out their obligatory half hour of programming, or an unfoldable broadsheet newspaper, or whatever. They had to pretend that everything that they reported on was a big deal. The temptation to arrive on camera and just fess up to the absolute chasmic absence of news must have been overwhelming. "It's 9pm, thanks for joining us, but honestly you really shouldn't have bothered. Make your dinner and come back when Tomorrow's World will be reporting on a curious new invention called the internet." That's the kind of content I'd have respected. 

I felt a bit like that when I was preparing this blogpost. Don't get me wrong, there's some good stuff below (e.g. more schools are switching!), but at the same time, if you caught my December 2020 update on the MIS market, you're unlikely to be blown away by the newsiness of it all. Still, you're probably committed to at least a quick skim at this point, so let's get into it.

I recently received school MIS market data from January 2021. I've been requesting the termly files for a couple of years now, so I was able to combine it with data stretching back to October 2018 to analyse termly trends. What follows are my five key takeaways, accompanied by some charts, where I think they'll help to emphasise a point.

So, here we go.

1. SIMS churn is at an all-time high; challenger churn is at an all-time low

The most important number for all MIS market participants is churn: that is, losses over the past twelve months divided by the number of schools twelve months ago. For established players, you want this to stay low to protect your position. For challengers, you want it to be high.
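For the avoidance of doubt about what I mean by churn, here's a minimal sketch of the calculation in Python. It's illustrative only - the names are made up and this isn't my actual pipeline - but it captures the arithmetic: losses over twelve months divided by the base twelve months ago.

```python
# Illustrative sketch only: churn = losses over the past twelve months,
# divided by the vendor's school count twelve months ago.
def churn(schools_year_ago: set[str], schools_now: set[str]) -> float:
    """Each argument is the set of school URNs using a given MIS at that census."""
    if not schools_year_ago:
        return 0.0
    losses = schools_year_ago - schools_now  # schools that have left the vendor
    return len(losses) / len(schools_year_ago)

# e.g. a vendor with 10,000 schools a year ago that has since lost 450 of them:
print(churn({f"URN{i}" for i in range(10_000)},
            {f"URN{i}" for i in range(450, 10_000)}))  # -> 0.045, i.e. 4.5%
```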

I've written before, in my annual updates, about how SIMS churn has grown steadily since 2014, when it was under 1%. The term-by-term data shows this trend continuing. The following chart shows annualised churn using a rolling average of the three previous data periods for the five leading MIS. What you see is:

  • Overall churn has been hovering around 4% for a couple of years, and is showing signs of ticking upwards. 
  • SIMS churn hit 4.5% in Jan 21, the highest rate over the period (and the highest since at least 2010, when my main annual dataset begins).
  • There are two clear groups when it comes to churn: the two largest MIS (SIMS and RM), who both churn around 4-5% with churn ticking slowly upward; and the three main challengers (Arbor, Bromcom and ScholarPack), who have been churning at 0-2.5%, with churn ticking notably downwards, particularly in the most recent data set.
That last point is particularly key for those trying to understand where the market is heading. In summary, it's good for challengers! They're keeping their schools while seeing an ever-expanding pool of switchers to fight for. In an ideal world, the challengers would of course love overall churn (which for a while will still mostly mean SIMS churn) to tick up closer to, say, 10%. But in the meantime, they can probably live with 4-5% of schools being up for grabs every year, particularly if their own customers stay loyal.



Another way of seeing how churn is changing is to look at the last three years of Oct-Jan switching data only, and compare like-for-like volumes. What you see is that wins are at a three-year high over that period (196 in 2021 vs 191 in 2019 and 145 in 2020). So, while there's no dramatic change, the signs do point to continued growth in switching.



2. The Key and Bromcom continue to be the destinations of choice for switchers.
Arbor, Bromcom and ScholarPack continue to be the big 3 MIS in terms of winning over switchers, so there's nothing new there. However, what is new is that Oct 20-Jan 21 was the first period I've ever seen where SIMS was only the fifth best MIS for new wins (Pupil Asset managed to win 5% of schools, compared to 4% for SIMS). Now this could be a blip - SIMS were acquired by Montagu during the period in question - and that kind of transaction is bound to lead to some short-term disruption. Nonetheless, the new management team at SIMS will no doubt be doing all they can to reverse this trend in future periods. Conversely, the Pupil Asset team (who have recently rebranded to Juniper Horizons) will be cheered to see those signs of growth after a comparatively quiet previous year or two.

Another thing to note is that, as with overall market numbers, it makes a difference whether you're looking at schools won, or the number of pupils in schools won. Bromcom won 16% of schools in the period (3rd place for wins), but 25% when measured by pupils (2nd place for wins).



3. Secondaries have (at least) two non-SIMS cloud options.
For at least a year now, Arbor and Bromcom have together accounted for c. 75%+ of secondary switchers, with a fairly even split between the two.  That's a significant change from 2019, when SIMS were stronger and Arbor had less success with secondaries. 

That's surely good news for schools, who will be increasingly aware that they have (at least) two cloud options. 



4. That said, there are plenty of other people trying to shake up the market 
Of course, that doesn't mean things won't *ever* change. The last year has seen huge upheaval in the market: as well as The Key buying Arbor, IRIS bought iSAMS and Juniper bought Pupil Asset. Faronics have been looking to grow a market presence for a few years now, and Go4Schools also have an embryonic MIS. Keep an eye on ET-AIMS, which has recently entered the market too. And of course, as I mentioned in my disclaimer, I'm also now helping Compass (an Australia-based global MIS vendor) to launch in the UK. That's a lot of additional energy being expended on trying to shake up the market...

5. You now have a choice for your MIS market analysis! 
I sometimes mention how I was inspired (and helped greatly) to get started with MIS market analysis by Graham Reed, who was the first to produce publicly available MIS market stats, and still cranks out interesting MIS analytics in Power BI. Well, now The Key have gotten in on the act, with their very own Tableau-powered MIS market analysis.

What I'm saying is: we're basically now our own sector. Perhaps someone will start a blog to analyse our market share of MIS market page views? And it can surely only be a matter of time before we get our own BETT award category?

Anyway, given this new competitive landscape, I feel the need to provide at least a couple of interactive charts as a thank you for making it all the way to the end. So to close out the blog, here is the full term-by-term picture from Oct 2018 to Jan 2021 (both by school and pupil numbers):


Saturday 20 February 2021

In edtech, AI is only the answer if you know the question

There's much talk these days about how Artificial Intelligence (AI) and Machine Learning (ML) can transform education. Century Tech have just announced Century APIs, which moves them from being a learning platform to also being a form of middleware, providing partners with access to their AI technology as a service. Sparx have been busy building a personalised learning experience for schools (though they prefer to describe their approach as "Intelligence Augmentation" rather than full AI). And Quizlet (an edtech "unicorn") have a service called Quizlet Learn, which uses ML to provide an adaptive learning experience.

Now, I'm not an expert in the area, so this blog isn't going to be a rigorous analysis of what AI/ML is, or who's doing it best. But I am an edtech entrepreneur, and I hear lots of other edtech entrepreneurs considering whether to use AI/ML, so the purpose of this blog is to propose a way of answering that question.

Or, to put it another way, I'm asking: "what problems should AI be trying to address?". I've seen too many business plans which can be summarised as: "we're a bit like other things, but with AI, so better"; and that might be true... but why? 

Let's examine the question by using the following wildly simplistic anatomisation of the learning process:

  1. Content (knowledge, facts, ideas, skills etc) is created in the form of lesson plans, videos, worksheets, quizzes and such.
  2. That content is shared somehow with a student.
  3. The student attempts to absorb the content using the materials provided.
  4. A teacher (or software) marks the student's work.
  5. (Sometimes) the student revises or revisits that content.
  6. (Sometimes) there is a final test to assess whether the content was correctly learned.

So, how could AI help with each of those steps?

Starting with point 1, I don't think many people are using AI to create content, so a pretty key point is: did you create good content for the AI to have fun with in the first place? And this is my first anxiety: 

If you're an AI-powered product, and you don't have a good rationale for why your content is better than someone else's, then why should I believe your AI is going to change that? 

AI can't (yet) turn a bad module into a good one: it's not rerecording videos with better analogies, or more succinct summaries, or whatever. Some might argue that AI can help pick the preferred content type for the specific student, but that sounds a lot to me like learning styles, and I've read enough Daisy Christodoulou to be highly sceptical as to whether that's a thing.

Moving to point 2, I guess you could make a case for AI somehow informing the sharing process, but I don't see that happening anywhere in practice. Rather, people build software, and they make tonnes of choices that are then hardwired into the learning experience, which may end up mattering more than anything their AI does. Stuff like: 

  • how you log in
  • what the user interface looks like
  • how quickly pages load
  • how many clicks are needed to move between pages

Why does that matter? Well, in a recent conversation about Carousel (the quizzing platform I've co-founded), an esteemed academic made the excellent point to me that the biggest difference we can make is to get someone who previously wouldn't have attempted a task to attempt it. Until then, I hadn't spent enough time thinking about how taking a student from "didn't bother" to "learned a thing imperfectly" is perhaps a more profound change than going from "learned a thing imperfectly" to "learned a thing well". And maybe the best way to do that is to make it so easy to use that even easily distracted (or low-resilience) learners give it a crack.

So, the point is:

If you're an AI-powered product, by all means spend money on the AI, but don't forget to make the product good in all the non-AI ways that may end up impacting the learning experience too.

OK, time for points 3 and 5. The reason why I'm lumping these together is because it's actually pretty important - and not always clear - whether an AI-powered product is intended to be a full curriculum product, or a revision / homework / supplementary learning tool. Now, just from a sales perspective, I think it's much easier to sell a revision/homework edtech tool than a full curriculum product, at least for the time being. But also, from a design perspective, your focus here makes a big difference to what you'd ask your AI to do. A full curriculum product is likely to be significantly more complicated, with more types of content and tasks, each of which needs deep and careful thinking about how AI might help. On the other hand, if you're just a revision / homework tool, maybe your main "thing" is just questions (or videos, or pods, or whatever).

Either way, at this step the only things I can think of that the AI can do are:

  • sequence content differently
  • select different types of content (Learning styles!) 
Now sure, I can see a value in AI for sequencing... but is it transformationally better than a traditional, well-designed curriculum? I'm not convinced, yet. I guess the complexity and popularity of a course are factors, too: there are some pretty great and well-thought-through primary maths curricula out there, so maybe AI can't add a tonne by suggesting a different sequencing. On the other hand, perhaps there's more fertile ground in more complex and less universal areas such as A-Level Physics. In any case, my point is:

If your AI is focused on sequencing content or selecting different types of materials, you need to be able to explain why this makes you more effective than a decent teacher.

Finally, let's think about points 4 and 6. Here's where I see the most potential for AI/ML. So much edtech revolves around multiple choice questions, because they're easy for a machine to mark, so I can see AI playing a major role in expanding the range of assessment types that computers can handle. Startups like Progressay (essay marking) and Lexplore (reading and dyslexia assessment using eye tracking) interest me: the appeal is that I recognise the problem, and can understand - and therefore believe in - the role of AI.

Another angle that AI/ML can help with is spaced repetition (i.e. how frequently, and at what interval, questions are asked and re-asked to help embed knowledge). This is something we've been thinking about a lot at Carousel, and we expect to introduce innovations in this area over the next year or so. This is something that really isn't intuitive to teachers; and even if it was, it's hard to have the discipline to remember to re-quiz students on a subject when you have so much new ground to cover. So I can see AI/ML playing a really useful role in the revision and embedding process. But, at the risk of repeating myself, this will only work if the content is well-designed, and your product is designed in a way that students actually want to interact with.
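To make the idea concrete, here's a toy Leitner-style schedule - purely an illustration of the general spaced repetition technique, and emphatically not how Carousel does (or will do) it:

```python
from datetime import date, timedelta

# Toy illustration of spaced repetition (a Leitner-style schedule):
# a correct answer pushes the next review further into the future,
# a wrong answer drops the question back to a short interval.
INTERVALS = [1, 3, 7, 14, 30]  # days between reviews for boxes 0..4

def next_review(box: int, correct: bool, today: date) -> tuple[int, date]:
    box = min(box + 1, len(INTERVALS) - 1) if correct else 0
    return box, today + timedelta(days=INTERVALS[box])

box, due = next_review(box=0, correct=True, today=date(2021, 6, 19))
print(box, due)  # -> 1 2021-06-22 (answered right, so re-ask in three days)
```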

Finally, it's worth remembering that many of the Big Technological Leaps Forward we need to make in edtech can be achieved without AI/ML. I'm also co-founder of a MAT assessment platform called Smartgrade, built with input from the legends at Evidence Based Education. Smartgrade uses a bunch of algorithms to standardise common assessments, and also to feed back on how well designed they are. That's not AI/ML; but it is clever use of technology to automate and improve assessment accuracy. AI isn't the only way.
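For anyone wondering what "standardise" means in practice, one common approach - and I'm not claiming it's exactly what Smartgrade does under the hood - is to convert raw marks into z-scores and rescale them onto a familiar standardised scale:

```python
from statistics import mean, stdev

# Illustrative only: rescale raw marks so that results from different tests
# sit on a comparable scale (here: mean 100, standard deviation 15).
def standardise(raw_scores, target_mean=100.0, target_sd=15.0):
    m, s = mean(raw_scores), stdev(raw_scores)
    return [target_mean + target_sd * (x - m) / s for x in raw_scores]

print([round(x, 1) for x in standardise([12, 15, 18, 21, 24])])
# -> [81.0, 90.5, 100.0, 109.5, 119.0]
```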

So in summary, I'm not saying AI can't help with edtech. Rather, I'm saying:

Your theory of AI will be most compelling if you can articulate which part of the learning process it tackles, and why that bit needs AI in the first place.


Monday 18 January 2021

I now sell things

If you're a regular reader of my blog then you're probably here either because you're professionally involved in edtech in some way and so feel obligated to keep abreast of the sector, or because you just like niche stats and charts about school information systems. Either way, I now sell products and services which may interest you.

To be more specific, I offer bespoke sector research and consultancy via my company (Edtech Experts Ltd), and also license two products:

  1. The English state school Management Information System (MIS) dataset that underpins my blog. I now have eleven (count 'em!) years of data on MIS market usage broken down by schools. The data includes a row per year for each school, with a flag whenever a school switches, plus information on the school's phase (e.g. primary/secondary), pupil numbers, postcode, type (e.g. academy/LA), MAT affiliation and MAT size (if relevant). 
  2. A jam-packed, 50-page report on the UK MIS market, which includes:
    1. A recap of the past year, including a summary of the major acquisitions (and hoo boy were there acquisitions in 2020).
    2. A tonne of analysis on MIS market stats which goes well beyond the blog. This is my favourite bit to write so I really go to town.
    3. NEW customer satisfaction data, including some fascinating and NEVER BEFORE SEEN headline figures from the lovely people at Teacher Tapp, who also collect and license some very valuable data on the MIS market. Oh, and vendor app review data, which is also intriguing.
    4. A taxonomy of what's in a MIS, which is a surprisingly philosophical question.
    5. Analysis of the big players in the market and their range of products / module coverage.
    6. Vendor profiles for the major MIS.
    7. Information on UK MIS procurement, including how MATs / LA schools buy a MIS.
Get in touch if you're interested in any of the above. Prices are reasonable I swear, and I do hefty discounts for MIS vendors (because they're nice people and the report is better for having them in the loop) and researchers / smaller companies (because I've been there). I'll also gladly do free, no-obligation sneak previews of both products on a call, if that helps you understand what I've got. 

Oh, and more reports are coming soon covering other areas of edtech, so watch this space!

For more information, give me a shout on Twitter, LinkedIn or via email.