Harnessing Open-Source Technologies for Edge Analytics: TIBCO Scale Model

Open-source technologies are quickly becoming an essential component of data analysis, and “To the Edge and Beyond” offers insights into how edge computing brings data processing as close to the data source as possible. Host James Kent discusses the topic with Intel’s Brad Corrion, Director of Strategic Software Architecture, and TIBCO Software’s Jesús Centeno, Chief of Staff and Innovation Strategy for the Office of the CTO. The two guests define the edge and explain how it compares to cloud computing.

“The main aspect is always contrasting with cloud computing,” Centeno added. “And how much more costly it is to send the data to the cloud, to do the processing, the inferencing, and get the insights that you need, and then send back those insights to the edge device to take action.” Centeno said people want instant access to that information so they can make decisions right away. But if edge computing is to catch up to the flexibility and scalability of cloud computing, tools are needed to make mass deployment a reality, and open-source projects are one way to accelerate innovation at the edge.
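The trade-off Centeno describes can be sketched in a few lines. The timings below are illustrative assumptions, not measurements: a cloud round trip pays a network cost that local edge inference avoids, even when cloud hardware runs the model faster.

```python
# Illustrative sketch of the edge-vs-cloud latency trade-off.
# All timing constants are hypothetical assumptions for illustration.

CLOUD_NETWORK_RTT_S = 0.120   # assumed round trip to a cloud endpoint
CLOUD_INFERENCE_S = 0.010     # assumed inference time on cloud hardware
EDGE_INFERENCE_S = 0.030      # assumed inference time on an edge device

def cloud_latency() -> float:
    """Time to send data to the cloud, run inference, and return the insight."""
    return CLOUD_NETWORK_RTT_S + CLOUD_INFERENCE_S

def edge_latency() -> float:
    """Time to run inference directly on the edge device, with no network hop."""
    return EDGE_INFERENCE_S

if __name__ == "__main__":
    print(f"cloud round trip: {cloud_latency() * 1000:.0f} ms")
    print(f"edge inference:   {edge_latency() * 1000:.0f} ms")
```

Under these assumed numbers the edge wins despite slower hardware, which is the "instant access" point Centeno makes; the balance shifts if the network is fast or the model is too large for the device.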

Intel and TIBCO Software are both focused on propelling open-source technology in new and exciting ways. One such example is Project AIR, an IoT platform for collecting, processing, and visualizing IoT data. “Project AIR specifically focuses on edge computing. You can think of Project AIR as your IoT platform that is going to help you connect the edge to the rest of your ecosystem,” Centeno said. Through the collaboration between Intel and TIBCO, the teams have created a whole suite of tools that allow developers to experiment, innovate, and gain access to hardware and data more quickly.

Corrion says there’s one final hurdle for companies preparing to adopt edge computing. “Company teams need to start playing with these technologies. That’s because when we talk about MLOps, deep learning, collecting data, using containers, and using orchestration, we are referring to cloud practices that are brand new to this environment. Teams need to start playing with the tech, developing practices and muscle memory so they become natural for them. If a company doesn’t start now, that learning curve is going to hurt them later.”

Subscribe to this channel on Apple Podcasts, Spotify, or Google Podcasts to hear more from the Intel Internet of Things Group.

To learn more about Project AIR, please check out the links below:
