For years, DevLearn has been one of my favorite conferences. This year, however, I experienced it from a new vantage point: the exhibitor side, as a member of the Rustici Software team. Joining the booth team for the first time was truly a rewarding experience. Hearing attendees describe their pain points and then introducing our products as a solution was incredibly satisfying. The entire event was buzzing with energy, but one major theme that stood out was the evolution of AI.

The noticeable shift in AI

As in previous years, AI was a major focus at this year’s DevLearn, but there’s been a very noticeable shift in its direction and application. Previously, it had been very focused on content generation: creating courses, quiz questions, scripts, videos, etc.

Now, the application of AI has shifted to the learner. Platforms are making a big push around skills and job roles, mapping and distributing training precisely to those needs. This has always been a holy grail of sorts. If you’ve heard terms like “dynamic content” or “just-in-time training,” those were attempts to deliver and modify content based on which learner was taking it.

Bridging the gap between data and content

Doing this kind of skills- or role-based dynamic learning requires a large corpus of data about the learner, intimate knowledge of the content, and a system that can bridge the two. With AI, it feels like we finally have a system that can successfully bridge the gap between data and content. On one side, we have LMS and HRIS applications that can leverage everything known about the learner and deliver relevant courseware to them. On the other side, we have tools like our Rustici Generator product that can break down learning standards content for use in these systems.

So where do learning standards fit in?

Whenever we talk about data in eLearning, I always immediately think of xAPI. We had a lot of great talks around xAPI at our booth, and across all those insightful conversations, there was a universal acknowledgement that it hasn’t taken off like many of us had hoped. This is in no way discounting the myriad of amazing xAPI-based learning out there. But I’ve always felt the problem in framing xAPI was talking about it as both a learning standard and a product. Want dynamic training? Build it with xAPI! As if the act of using xAPI automatically gave you something more than its SCORM-y counterparts. Phase 1: put xAPI in a course. Phase 2: ? Phase 3: profit.
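To make the standard-versus-product distinction concrete, here is what the standard itself actually gives you: a statement describing who did what. A minimal sketch in Python, where the learner email and activity IRI are illustrative placeholders (the `completed` verb IRI is a real ADL-defined verb):

```python
import json

# A minimal xAPI statement: who (actor) did what (verb) to what (object).
# The mbox address and activity IRI below are illustrative placeholders.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/safety-course",
        "definition": {"name": {"en-US": "Safety Course"}},
    },
    "result": {"score": {"scaled": 0.92}, "success": True},
}

print(json.dumps(statement, indent=2))
```

A course emits statements like this to a Learning Record Store (LRS) as the learner progresses; the standard defines the vocabulary, not what you do with the records afterward.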

We, of course, know this not to be true: populating xAPI statements into your content has always been the straightforward part. It’s what to do with all that data, what is actionable, what is tangible, that is the hard part. And for years, the actionable part has had a high technical overhead to implement. Using xAPI for a course leaderboard? Better know some JavaScript and SQL, and have access (and approval) to use a tech stack.
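As a sketch of what that overhead looks like, here is a hypothetical leaderboard built by aggregating scored xAPI statements. In practice you would fetch the statements from an LRS over its REST API; the function and sample data below are assumptions for illustration, but the aggregation logic is the same:

```python
from collections import defaultdict

def build_leaderboard(statements):
    """Rank learners by their best scaled score across scored xAPI statements."""
    best = defaultdict(float)
    for stmt in statements:
        # Identify the learner by display name, falling back to their mbox.
        actor = stmt["actor"].get("name") or stmt["actor"].get("mbox", "unknown")
        score = stmt.get("result", {}).get("score", {}).get("scaled")
        if score is not None:
            best[actor] = max(best[actor], score)
    # Highest score first.
    return sorted(best.items(), key=lambda pair: pair[1], reverse=True)

# Illustrative statements (in a real system, queried from an LRS).
sample = [
    {"actor": {"name": "Ada"}, "result": {"score": {"scaled": 0.9}}},
    {"actor": {"name": "Grace"}, "result": {"score": {"scaled": 0.75}}},
    {"actor": {"name": "Ada"}, "result": {"score": {"scaled": 0.6}}},
]
print(build_leaderboard(sample))  # → [('Ada', 0.9), ('Grace', 0.75)]
```

The code itself is short, but shipping it means standing up a query pipeline, a datastore, and a place to host the result, which is exactly the overhead that kept this kind of thing out of reach for many teams.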

What’s next?

On the plane ride home, though, I was thinking that maybe xAPI is the right answer that arrived too early. The standard already has robust models around both the learner and the content. What has been missing is what AI now offers – a way to easily and scalably do things with the data. xAPI can, and absolutely should, have a place in this new learner-based AI learning world.

If DevLearn proved one thing, it’s that the future of personalized training has arrived. For years, xAPI provided the perfect language to capture complex learner data, but we lacked the scalable intelligence to utilize it and provide the necessary context for skills-based learning. Now, AI is the horsepower that processes those vast stores of xAPI statements, instantly filling the contextual gaps to enable truly proactive, just-in-time training. The long-sought-after dynamic learning engine is finally here.

If you weren’t able to attend DevLearn this year, we’d love to catch up and share more about what we learned and discuss how AI and xAPI can transform your training strategy. Reach out to our team!

Stephen fell into programming completely by accident after a job asked him to do some programming work. Here he is 20 years later, a principal engineer working across teams and with a focus on AI efforts.