The Tech Panda takes a look at recent tech launches.
Confluent, Inc., the data streaming company, announced the Confluent OEM Program for managed service providers (MSPs), cloud service providers (CSPs), and independent software vendors (ISVs), making it easy to launch and enhance customer offerings with a complete data streaming platform for Apache Kafka® and Apache Flink®. With a license to redistribute or embed Confluent’s enterprise-grade platform globally, partners can bring real-time products and Kafka offerings to market faster and monetize customer demand for data streaming with limited risk. The program makes data streaming a high-margin part of the business through expert implementation guidance and certification to help partners launch enterprise-ready offerings, flexible commercial terms that match the ways partners sell, and ongoing technical support to ensure long-term customer success.
“As data-driven technologies like GenAI become essential to enterprise operations, conversation has shifted from ‘if’ or ‘when’ a business will need data streaming to ‘what’s the fastest, most cost-effective way to get started?’” said Kamal Brar, Senior Vice President, Worldwide ISV and APAC, Confluent. “We help our partners unlock new revenue streams by meeting the growing demand for real-time data within every region they serve. Confluent offers the fastest route to delivering enterprise-grade data streaming, enabling partners to accelerate service delivery, reduce support costs, and minimize overall complexity and risk.”
The Why
The need for real-time data has cemented data streaming as a critical business requirement. According to ISG Software Research, “by 2026, more than three-quarters of enterprises’ standard information architectures will include streaming data and event processing.” To meet this need, teams often turn to popular open source technologies like Kafka and Flink. However, building and maintaining open source software, especially at scale, quickly becomes prohibitively expensive and time-consuming. On average, self-managing Kafka takes businesses more than two years to reach production scale, with ongoing platform development and operational costs exceeding millions of dollars per year. Over time, solutions built with open source Kafka and Flink consume more and more engineering resources, which impacts a business’s ability to focus on differentiation and maintain a competitive advantage.
The Confluent OEM Program alleviates the burdens of self-managing open source technologies while going far beyond just Kafka and Flink. MSPs and CSPs can easily deliver a complete data streaming platform through Confluent, providing a hassle-free solution for unlocking more customer projects across AI, real-time analytics, application modernization and more. ISVs can embed Confluent within their products or applications to cost-effectively power modern customer experiences fueled by real-time data. Confluent simplifies data streaming by eliminating the operational complexities of open source deployments, accelerating delivery times, and ensuring customer success through ongoing expert support. Secure, governed data streams can be available wherever needed—on premises, at the edge, and in the cloud.
Features
Rapyder Cloud Solutions, a cloud consulting and services company, launched Rapyder Tech Studio, a platform that allows customers to ‘Try & Buy’ Cloud and Generative AI solutions online. The service lets customers experiment in real time, seamlessly book a proof of concept (POC), and drive innovation faster. It also simplifies customer interactions by providing an easy, efficient, and intuitive way to explore and acquire the solutions they need. With a comprehensive range of cloud and Generative AI products and services tailored to different industries, Tech Studio empowers customers to make informed, business-impacting decisions in just a few clicks.
“We’re thrilled to launch Rapyder Tech Studio, a platform that empowers our customers to easily try and buy cutting-edge Cloud and Gen AI solutions. This initiative reflects our commitment to enhancing their cloud experience. With Tech Studio, customers can experiment in real-time, tailor solutions to their specific needs, and drive innovation to scale efficiently in today’s competitive market,” – Amit Gupta, Founder & CEO, Rapyder Cloud Solutions.
“Rapyder Tech Studio represents a significant leap for us in offering simplified industry-wise Cloud & GenAI solutions. With our try-and-buy portal, customers can now test, customize, and deploy scalable solutions on demand; making cloud adoption seamless and accelerating the deployment of AI-driven applications across industries,” – Athreya Ramadas, Co-founder & CTO, Rapyder Cloud Solutions.
Features
F5 (NASDAQ: FFIV) announced the general availability of F5 NGINX One, which combines advanced load balancing, web and application server capabilities, API gateway functionality, and security features in a dedicated package. Customers can now manage and secure F5 NGINX instances and NGINX Open Source from a single cloud management interface. End-to-end visibility helps teams bring apps to market faster and adopt advanced capabilities such as AI more efficiently than a traditional siloed approach.
“Successful application deployment is a team sport,” said Shawn Wormke, Vice President and General Manager for NGINX at F5. “App delivery and security functions—and corresponding visibility—are often sequestered among individual groups. NGINX One is ideal for modern, ephemeral, and cloud-native app components such as containers and Kubernetes, providing a solution that cost-effectively optimizes, scales, and secures complicated application and API environments across multiple teams.”
The Why
Today’s application teams in the enterprise face the unprecedented difficulty of delivering apps across a wide variety of contexts—from high-performance “bare metal” servers to virtual machines and sprawling Kubernetes clusters, across data centers and in the public cloud. These widely distributed application architectures make it difficult to apply uniform policies for security, compliance, and app delivery configuration. For many organizations, maintaining hybrid and multicloud environments adds considerable operational overhead.
NGINX One improves app security and delivery for development, operations, and platform teams by making it easier to own, optimize, and govern NGINX components in any context. With the NGINX One Console, teams can broadly and easily enforce security policies across the application ecosystem, receive and implement configuration guidance, and automate version and patch updates—all helping to ensure compliance.
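The delivery policies NGINX One centralizes are, on each instance, ordinary NGINX configuration. As a minimal sketch of the load balancing and proxying the platform manages at fleet scale (the upstream name and addresses here are hypothetical):

```nginx
upstream app_backend {
    least_conn;               # route each request to the least-busy server
    server 10.0.0.11:8080;
    server 10.0.0.12:8080;
}

server {
    listen 80;
    location / {
        proxy_pass http://app_backend;
        proxy_set_header Host $host;
    }
}
```

Keeping fragments like this consistent across hundreds of instances is exactly the version- and configuration-drift problem a single management console addresses.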
Features
Confluent, Inc., the data streaming company, introduced new capabilities to Confluent Cloud that make stream processing and data streaming more accessible and secure. Confluent’s new support for the Flink Table API makes Apache Flink® available to Java and Python developers; private networking for Flink provides enterprise-level protection for use cases with sensitive data; the Confluent Extension for Visual Studio Code accelerates the development of real-time use cases; and Client-Side Field Level Encryption encrypts sensitive data for stronger security and privacy.
“The true strength of using Apache Flink for stream processing is that it empowers developers to create applications that instantly analyze and respond to real-time data, significantly enhancing responsiveness and user experience,” said Stewart Bond, Research Vice President at IDC. “Managed Apache Flink solutions can eliminate the complexities of infrastructure management while saving time and resources. Businesses must look for a Flink solution that seamlessly integrates with the tools, programming languages, and data formats they’re already using for easy implementation into business workflows.”
The Why
More businesses are relying on stream processing to build real-time applications and pipelines for various use cases spanning machine learning, predictive maintenance, personalized recommendations and fraud detection. Stream processing lets organizations blend and enrich their data with information across their business. Apache Flink is the de facto standard for stream processing. However, many teams hit roadblocks with Flink because it’s operationally complex, difficult to secure, and has expensive infrastructure and management costs.
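Conceptually, the enrichment described above means joining each event with reference data from elsewhere in the business as it arrives. A framework-free Python sketch of that pattern (the event fields and lookup table are hypothetical, and this is the concept only, not Confluent’s or Flink’s API):

```python
# Minimal stream-enrichment sketch: merge each incoming event with
# reference data and emit the enriched record downstream. Flink does
# this at scale, adding state, event-time, and fault-tolerance handling.

def enrich(events, customers):
    """Yield each event merged with its customer's reference profile."""
    for event in events:
        profile = customers.get(event["customer_id"], {})
        yield {**event, **profile}

# Hypothetical reference table and event stream.
customers = {42: {"segment": "premium", "region": "EMEA"}}
events = [{"customer_id": 42, "amount": 99.0}]

enriched = list(enrich(events, customers))
print(enriched[0])
# {'customer_id': 42, 'amount': 99.0, 'segment': 'premium', 'region': 'EMEA'}
```

A fraud-detection pipeline, for instance, would enrich each payment event with the customer’s segment and region before scoring it.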
Features
Felicity Games, a DeVC-backed casual game developer and publisher, has tied up with AbhiTech Games, a game development studio, to launch and publish two games: Warbound and Laser Tanks. With this partnership, AbhiTech Games will gain access to Felicity’s 1 million users, while Felicity aims to grow 3x and almost double its monthly unique users by Q2 2025.
The studio will aim to scale the games with the help of ‘Pokhran’, a proprietary framework that rapidly prototypes and tests casual games for commercial viability in partnership with Indian game developers.
Anurag Choudhary, Founder & CEO of Felicity Games stated, “Through this partnership combining our expertise in casual game publishing with AbhiTech Games’s award-winning game development, we aim to not only expand our audience but also carry forward a shared vision. The innovative Pokhran framework will surely enhance our growth and create unforgettable experiences for players everywhere.”
Abhishek Singh Rana, Indie Developer and Founder at AbhiTech Games, also stated, “The Pokhran framework will be a game-changer for us, helping us reach more players, achieve more downloads, and most importantly deliver memorable gaming experiences. Partnering with Felicity Games will allow us to take our games to the next level and grow alongside a company that shares our passion for innovation.”
Cloudflare, Inc. (NYSE: NET), the connectivity cloud company, announced AI Audit, a set of tools to help websites of any size analyze and control how their content is used by artificial intelligence (AI) models. For the first time, website and content creators will be able to quickly and easily understand how AI model providers are using their content, and then take control of whether and how the models are able to access it. Additionally, Cloudflare is developing a new feature where content creators can reliably set a fair price for their content that is used by AI companies for model training and retrieval augmented generation (RAG).
“AI will dramatically change content online, and we must all decide together what its future will look like,” said Matthew Prince, co-founder and CEO, Cloudflare. “Content creators and website owners of all sizes deserve to own and have control over their content. If they don’t, the quality of online information will deteriorate or be locked exclusively behind paywalls. With Cloudflare’s scale and global infrastructure, we believe we can provide the tools and set the standards to give websites, publishers, and content creators control and fair compensation for their contribution to the Internet, while still enabling AI model providers to innovate.”
The Why
Website owners, whether for-profit companies, media and news publications, or small personal sites, may be surprised to learn that AI bots of all types are scanning their content thousands of times every day without their knowledge or compensation, destroying significant value for businesses large and small. Even when website owners are aware of how AI bots are using their content, they lack a sophisticated way to decide what scanning to allow and a simple way to act on that decision. For society to continue to benefit from the depth and diversity of content on the Internet, content creators need tools to take back control.
With AI Audit, Cloudflare aims to give content creators the information and control they need for a transparent exchange between websites that want greater control over their content and AI model providers in need of fresh data sources, so that everyone benefits.
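Today, the bluntest tool site owners have is robots.txt. For example, a site can ask known AI crawlers not to scan its content (GPTBot and CCBot are user-agent strings published by OpenAI and Common Crawl, respectively):

```txt
# robots.txt: request that AI training crawlers skip the whole site
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

Compliance with robots.txt is voluntary, and it offers neither verification nor compensation, which is the gap tools like AI Audit aim to close.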
Features
Data Dynamics, an enterprise data management company, expanded into India with the establishment of its Centre of Excellence in Pune, Maharashtra. In tandem, the company unveiled Zubin, a groundbreaking AI-powered self-service data management software set to redefine how organizations approach risk management, privacy, sovereignty, optimization, and sustainability. Zubin pioneers an industry-first DIY (Do It Yourself) approach to managing data that puts data ownership, control, and action directly into the hands of data creators.
“In a world consumed by AI use cases and implementation, providing transparency in data management is critical for establishing digital trust between enterprises and their customers,” said Piyush Mehta, CEO of Data Dynamics. “At Data Dynamics, we approach data with the highest level of respect—ensuring that every byte is managed responsibly and that ownership is returned to those who create it. Zubin embodies this philosophy; it fosters a culture of ownership and accountability, placing the power of data management directly in the hands of users. Zubin is THE strategic enabler of digital trust, data sovereignty, and data democracy, guiding organizations through the complexities of this AI age with confidence and clarity. The Pune Center of Excellence brings our data management solutions closer to the demographic that is generating data at a rapid pace, has already tabled a data protection policy, and is at the forefront of IT development and innovation.”
The Why
This launch comes at a critical juncture, perfectly aligned with the worldwide focus on data sovereignty, ethical AI, and data privacy, championing the core principles of digital trust and data democracy. As organizations grapple with the growing influence of AI, projected to add $15.7 trillion to the global economy by 2030, the demand for data privacy is reshaping the landscape, creating both opportunities and risks. Businesses now face the strategic challenge of balancing customer expectations, regulatory obligations, and the need for innovation. The challenge is intensified as 80% of enterprise data is unstructured and unmanaged, making data accuracy a monumental task for AI modeling, particularly given LLMs’ reliance on unstructured data.
Zubin addresses these challenges by empowering organizations to centralize data governance and decentralize data control, enabling central IT to set tailored data policies while simultaneously giving stakeholders at all levels self-service control over the data they create.
IDC’s Spotlight Report on Rethinking Data Security further validates this approach, stating that “Self-service data management tools designed to integrate security capabilities and prioritize privacy and protection will be invaluable for organizations seeking to maximize data value without compromising security.”
Features
BT Group’s Digital Unit launched an innovative internal platform to help the company tap into the power of large language models (LLMs) from providers such as Anthropic, Meta, Cohere, and Amazon. The GenAI Gateway, built in collaboration with AWS and using Amazon Bedrock, Amazon SageMaker and AWS Professional Services capabilities, provides secure, private access to a range of natural-language processing and large language models, a critical tool BT Group will use as it embeds AI into the way it runs the business.
Fabio Cerone, GM EMEA Telco at AWS, said: “The BT Group GenAI Gateway is showcasing how enterprises can effectively deploy generative AI at scale and speed. It’s been a brilliant, pioneering opportunity to collaborate and work backwards from the customer to provide a way to accelerate deployment of generative AI use cases into production with embedded security and compliance. The GenAI Gateway will trigger the flywheel effect in the adoption of generative AI, delivering quicker results for BT Group and its customers.”
Deepika Adusumilli, Managing Director, Data & AI, BT Group’s Digital Unit said: “AI is helping us reimagine the future of our company. We believe that where our data is a constant, we need flexibility with our LLMs. GenAI Gateway allows us to tap into this powerful new set of technologies at scale, in a way that is safe, responsible, flexible and scalable, delivering the ambition we have for AI to unlock the human potential within BT Group, today and in the future.”
The Why
Ad-hoc use of LLMs, whilst appropriate for test and development work, is not well suited to large-scale use; cost control, security and privacy need more careful management. LLM performance also needs to be monitored for unexpected errors (e.g. “hallucinations”) and model decay over time (where LLMs stop behaving as expected).
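A gateway centralizes exactly these concerns. As a rough sketch of the cost-control side (the model names, per-token prices and budget here are hypothetical placeholders, not BT Group’s implementation), a gateway can meter every call against a budget before it reaches a provider:

```python
# Sketch of gateway-side metering: accumulate per-model token spend
# and refuse calls once a budget is exhausted. All prices and model
# names are hypothetical.
from collections import defaultdict

PRICE_PER_1K_TOKENS = {"model-a": 0.01, "model-b": 0.03}  # USD, hypothetical

class GatewayMeter:
    def __init__(self, budget_usd):
        self.budget_usd = budget_usd
        self.spend = defaultdict(float)

    def record(self, model, tokens):
        """Charge a call to the budget; raise once the budget is spent."""
        cost = tokens / 1000 * PRICE_PER_1K_TOKENS[model]
        if sum(self.spend.values()) + cost > self.budget_usd:
            raise RuntimeError("budget exceeded")
        self.spend[model] += cost
        return cost

meter = GatewayMeter(budget_usd=1.00)
meter.record("model-a", tokens=5000)      # 5k tokens at $0.01/1k = $0.05
print(round(meter.spend["model-a"], 4))   # 0.05
```

The same choke point is where a real gateway would attach audit logging, privacy filtering, and response-quality checks for drift and hallucination monitoring.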
Features
Bitget Wallet, a Web3 non-custodial wallet, launched OmniConnect, a software development kit that enables developers to seamlessly connect Telegram Mini-Apps to multichain ecosystems across a multitude of blockchains, including major networks like Solana, TON and all EVM-compatible chains. The integration allows Telegram Mini-Apps to use Bitget Wallet for signing and conducting transactions across multiple blockchain networks.
Alvin Kan, COO of Bitget Wallet, highlighted the importance of this development, stating, “Previously, Telegram Mini-Apps could only interact with the TON network, making it difficult to engage with other public chains. Bitget Wallet’s OmniConnect aims to bridge this gap, enabling seamless multi-chain interaction via Bitget Wallet. We’re excited for more developers and blockchain ecosystems to join us in building a more open and thriving Web3 environment on Telegram.”
Features
Boson Whitewater, a water utility company that converts STP-treated water into high-quality potable water, has collaborated with Biome Environmental Trust for India’s first indirect potable water reuse project through managed aquifer recharge at Devanahalli, Karnataka. The project produces 6,40,000 litres of potable drinking water per day, adhering to BIS-10500 drinking water standards. The clean water now directly benefits thousands of residents in the Devanahalli municipality.
Vishwanath S, Advisor, Biome Environmental Trust, said, “Devanahalli town relies heavily on deep borewells for its water supply. Through this project, we aim to revive the local lake, recharge groundwater, and explore how a town can become self-sufficient by using both local water sources and treated wastewater. The project has the capacity to meet Devanahalli’s 5.4 MLD (Million litres per day) water demand. In Phase 1, a water treatment plant was installed to provide 240 KL (Kilo litres) of water daily. In Phase 2, the project expanded with the addition of four more filter borewells, a reconstructed 60 KL sump, and a new 400 KLD water treatment plant. The system now delivers 640 KL of water daily, benefiting the Devanahalli residents”.
The Why
The project is part of a broader effort to rejuvenate 65 lakes in Bengaluru by using treated wastewater and rainwater. It involves reviving an old well and digging borewells to access the aquifer, along with the installation of water treatment plants in two phases. Now, the system provides 640 KL of water daily, helping supplement the domestic water requirement of Devanahalli town and its 45,000 residents. This was made possible through the collaboration of several organisations, including Carl Zeiss, Rotary South Parade Bangalore, and the Wipro Foundation.
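The capacity figures quoted across the two phases are internally consistent, as a quick arithmetic check (using only the numbers stated above) shows:

```python
# Cross-check the Devanahalli capacity figures quoted above.
phase1_klpd = 240          # Phase 1 treatment plant, KL per day
phase2_klpd = 400          # Phase 2 additional plant, KL per day
total_klpd = phase1_klpd + phase2_klpd
print(total_klpd)          # 640 KL/day, i.e. 6,40,000 litres

demand_klpd = 5.4 * 1000   # the town's 5.4 MLD demand, in KL per day
share = total_klpd / demand_klpd
print(f"{share:.0%}")      # about 12% of the town's daily demand
```

So Phase 1 and Phase 2 together account for the 640 KL/day figure, supplementing roughly an eighth of the town’s stated 5.4 MLD requirement.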
Features