March 2026 was a wild month. Nobody is yet able to assess the real consequences of the war in the Middle East started by Israel and the USA, but there is no doubt it will affect all of us around the world, at least throughout 2026. I also expect it to significantly impact the transportation industry and, as a result, the telematics technology companies working within it. Based on my personal experience over the past 30 years, rising fuel prices have always led to increased demand for telematics services. But who knows how it will play out this time.
Anyway, flespi showed 100% uptime this month. The only message in our NOC was related to Microsoft blocking access to the tenant used for login on flespi.io. This led to our decision to part ways with Microsoft, and unfortunately it also means that login via Microsoft accounts is no longer available. You can still access your flespi account using other authentication methods such as Google, GitHub, or Facebook. And of course, within realms you can use any identity provider.
For assets, about a year after the initial release, we finally removed the experimental tag. Compare this to AI integration, which was moved out of experimental just a month after being introduced – a huge difference in pace between AI-driven and conventional services.
Speaking of AI, we also released dedicated skills for various coding agents to further improve their awareness of flespi. We see that users are increasingly working with flespi through AI agents and are now able to build fully customized telematics applications within days or weeks. This is a significant shift and is already changing the industry. Previously, implementing even a subset of such ideas could require a team of developers and budgets starting from $20,000. Now it can be done by a single person using the right set of AI coding tools connected to flespi via skills and MCP, which acts as a safe, telematics-focused data layer.
As promised, the initial version of the tacho-file-parse plugin was released in March. We expect some minor changes in the JSON output during follow-up work in April, but overall you can already start experimenting with data analysis and visualization on top of it.
We integrated two new protocols: selectcam and astra-telematics.
To accelerate further development of video telematics protocols, we analyzed usage patterns across our user base and decided to deprecate pre-configured cameras in device configuration. This simplifies video telematics API commands by allowing all channel, video, and audio arguments to be passed dynamically as part of a queued command instead of being hard-coded in a camera configuration. If you rely on pre-configured cameras, please update your setup during April.
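To illustrate the dynamic-argument style, here is a minimal Python sketch of how such a queued command could be assembled. The command name `video_start` and the property keys are assumptions for illustration, not the definitive API; the exact names depend on your camera's protocol, so check the command schema of your device type before use.

```python
import json


def build_video_command(channel: int, duration_s: int, with_audio: bool) -> dict:
    """Build a hypothetical queued command whose camera arguments
    (channel, duration, audio) are passed per call instead of being
    taken from a pre-configured camera in the device configuration."""
    return {
        "name": "video_start",       # hypothetical command name
        "properties": {
            "channel": channel,      # camera channel chosen at call time
            "duration": duration_s,  # recording length, seconds
            "audio": with_audio,     # whether to include the audio stream
        },
    }


# The resulting list would then be POSTed to the device commands queue
# via the flespi REST API (authorization token required):
payload = json.dumps([build_video_command(channel=2, duration_s=30, with_audio=True)])
print(payload)
```

Because the arguments travel with each command, the same device can serve requests for different channels and media types without any configuration changes in between.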
The flespi.io panel now natively supports large-scale accounts. We tested it with up to 250,000 devices, and it remains comfortable to use. We also introduced smart AI-generated filters for quick navigation between items – for example, showing devices that report a specific parameter within certain value ranges or match particular configuration settings.
One more useful enhancement from our UI team is the ability to switch between subaccount chats in HelpBox. This is especially effective if you want to separate your support team's communication with Codi into dedicated chats while granting it read-only access to all your flespi entities.
The open-source MQTT Board application maintained by our team was upgraded from Vue 2 to Vue 3, and we started adding various small enhancements such as pane rearrangement.
We also published a standalone application that visualizes all available flespi MQTT topics in a hierarchical structure. It allows you to quickly navigate through hundreds of flespi-specific topics and find exactly what you need. This application is also integrated into the flespi.io panel – when viewing entity properties, you can now see all MQTT topics available for subscription to consume its data or events.
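Once you have found the topic you need, consuming it is a short script. Below is a minimal sketch using the `paho-mqtt` Python client; the device ID `123456` and the token placeholder are assumptions you would replace with your own values, and the topic follows the `flespi/message/gw/devices/{id}` scheme you can browse in the topics application.

```python
def device_messages_topic(device_id: int) -> str:
    """Topic on which flespi publishes parsed messages for one device;
    '+' or '#' wildcards can widen the subscription."""
    return f"flespi/message/gw/devices/{device_id}"


if __name__ == "__main__":
    # Requires `pip install paho-mqtt`. The flespi broker authenticates
    # by token passed as the MQTT username; the password is not used.
    import paho.mqtt.client as mqtt

    client = mqtt.Client()
    client.username_pw_set("your-flespi-token")
    client.on_message = lambda c, userdata, msg: print(msg.topic, msg.payload.decode())
    client.connect("mqtt.flespi.io", 1883)
    client.subscribe(device_messages_topic(123456))  # hypothetical device ID
    client.loop_forever()
```

The same pattern works for any other topic shown in the panel's new per-entity topic list – only the subscription string changes.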
Codi set a new record in March by covering 96% of all communication with our users. To be honest, there is now almost no practical need for human-to-human communication on our side – whether for commercial or support questions. Codi handles it faster and, in many cases, better. To save even more time for our engineers, we also operate Birdy – an AI protocol engineer that acts as a connecting layer between Codi and our engineering team, and performs its role very efficiently as well.
Human engineers are now mostly involved in resolving non-standard cases, making decisions based on a wide range of input signals, past experience, and personal relationships. At some point, we may automate this layer too, but for now it remains an interesting and important part of the work to stay involved in. Most of the investigation, communication, and implementation is handled by AI. And because we do not rely on third-party harnesses but build our own, our AI performs this work very well – and keeps improving.
This level of delegation allowed us to double our output with the same team. At the same time, our engineers are now focused on truly complex tasks, which were previously difficult to prioritize due to the constant flow of routine protocol updates required to keep the platform aligned with hundreds of manufacturers continuously releasing new device firmware, updating protocols, and so on.
