Accelerating Inferencing Using HLS Hackathon
Energy efficiency is essential for edge devices, especially those powered by batteries or harvested energy, making low-power AI and machine learning inference a real challenge. In this hackathon we focused on accelerating inferencing using HLS; the mission was to build a high-efficiency hardware accelerator that delivers accurate predictions with minimal energy per inference.
The Hackathon has ended. The results will be on the leaderboard soon!
MNIST Hackathon
Overview
Energy can be critical in edge devices. Systems that are battery powered or rely on harvested energy need to be as efficient as possible, which can make deploying inferencing on these systems challenging; inferencing is notoriously power hungry. In this hackathon we focused on building an efficient inferencing accelerator. The winners were those who built the most efficient inferencing system, one that delivered predictions with the smallest amount of energy per inference. Implementations also had to meet strict performance, accuracy, and area requirements.
Oh, and there was a time limit: 30 days to complete this masterpiece of efficient engineering.
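Roughly speaking, the figure of merit comes down to average power multiplied by how long each inference takes:

E_per_inference = P_avg × t_inference

So a design could win by drawing less power, by finishing each inference sooner, or by balancing both.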
Leaderboard
Hackathon Podcast
Catch Russell Klein and Cameron Villone on our hackathon podcast, which covered submissions, tips, and exclusive updates during the HLS 2025 Hackathon.
The Algorithm
We used the MNIST handwritten digit recognition algorithm. Sure, it’s old and it’s small, but it is the one Yann LeCun got started with (and he’s now the Chief AI Scientist at Meta).
We picked it because it's small enough to retrain in a few minutes, practical to run in logic simulation, and quick to characterize.
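For a feel for the kind of kernel being accelerated, here is a minimal sketch of one fully connected MNIST layer in plain C++, the style HLS flows typically start from. It's an illustration under assumed layer sizes and 8-bit quantization, not the hackathon's reference code:

```cpp
#include <cstdint>

// One fully connected layer: 784 inputs (28x28 pixels), 10 digit scores.
// The sizes and the 8-bit quantization are illustrative assumptions.
constexpr int N_IN  = 784;
constexpr int N_OUT = 10;

void dense_layer(const int8_t pixels[N_IN],
                 const int8_t weights[N_OUT][N_IN],
                 const int32_t bias[N_OUT],
                 int32_t scores[N_OUT])
{
    for (int o = 0; o < N_OUT; ++o) {        // one accumulator per digit
        int32_t acc = bias[o];
        for (int i = 0; i < N_IN; ++i)       // multiply-accumulate loop
            acc += pixels[i] * weights[o][i];
        scores[o] = acc;                     // argmax of scores = prediction
    }
}
```

Loops like the inner multiply-accumulate are exactly what an HLS tool can unroll and pipeline into parallel hardware, which is where most of the speed and energy wins come from.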
The Starting Line
Participants were given a virtual machine equipped with all the tools and IP needed to build and characterize an ASIC implementation of their inferencing accelerator, courtesy of Siemens EDA. They started with a RocketCore RISC-V design and a bare metal application that runs the MNIST algorithm. The job was to make the inference run faster than any software implementation possibly could, all while the design sipped only tiny amounts of energy to get the job done. It was a unique opportunity to flex creativity, if you were up for the challenge.
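To make the hardware/software handoff concrete, a bare-metal offload often boils down to poking a few memory-mapped registers. The addresses and bit layout below are invented for illustration; each team's accelerator defined its own interface:

```cpp
#include <cstdint>

// Hypothetical memory-mapped accelerator registers (addresses invented
// for illustration; not the actual hackathon VM's memory map).
volatile uint32_t* const ACCEL_CTRL   = reinterpret_cast<volatile uint32_t*>(0x60000000);
volatile uint32_t* const ACCEL_STATUS = reinterpret_cast<volatile uint32_t*>(0x60000004);
volatile uint32_t* const ACCEL_RESULT = reinterpret_cast<volatile uint32_t*>(0x60000008);

// Offload one inference: start the accelerator, spin on its done bit,
// then read back the predicted digit.
uint32_t run_inference()
{
    *ACCEL_CTRL = 1;                  // kick off the accelerator
    while ((*ACCEL_STATUS & 1) == 0)  // poll until the done bit is set
        ;
    return *ACCEL_RESULT;             // predicted class, 0-9
}
```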
Objectives
Was your strategy to go as fast as possible? Or did you want your power to be as low as possible, even if it meant moseying through the calculations? Or perhaps the middle of the road, kinda fast and kinda low power? Check the leaderboard up top to see how the designs compared.
Winning Criteria Recap
The winning design was the one that met the accuracy, performance, and area criteria, and consumed the least average energy per inference. Participants used PowerPro from Siemens EDA to measure the power of their combined hardware and software system.
Prizes Awarded
All participants who completed the hackathon with a valid submission received a badge suitable for promoting their engineering genius on LinkedIn. If you made it to the top 3, have you posted on LinkedIn yet that you won the High-Level Synthesis low-energy inferencing hackathon?
🥇The first-place winner got an Elegoo Neptune 3D printer and an opportunity to speak at the Edge AI Foundation’s Fall Taipei event (physically or virtually). Literally, and we literally mean this “literally”: fame and fortune.
🥈The second-place winner got a Pynq FPGA development board from Digilent to hone their skills for next year’s competition. The Pynq board combines an ARM processor (with Python and AI capabilities) with FPGA fabric from AMD.
🥉The third-place winner can’t hear you because they’re enjoying their Bose QuietComfort Earbuds. They're likely lost in a blissful bubble of iconic audio, completely undisturbed by mere mortals thanks to that renowned noise cancellation. With a relentless, long-lasting battery powering their escape, all neatly packed into a compact, durable design, who can blame them? If you're feeling a sudden pang of 'earbud envy,' we totally get it!
Make sure to check HLS Academy often; you might be the one tuning out the world (and your colleagues) at our next hackathon. 😉