In this contributed article, Ellie Dobson, VP Product at Apheris, suggests that organisations know data is often the most valuable asset they own. But unlocking the full potential of these data assets is difficult in the face of privacy and security concerns, compounded by a lack of existing infrastructure for collaboration across organisations.
AI Isn’t Screenwriters’ Enemy – In Fact, It Can be a Creative Superpower
In this contributed article, Ben Pines, director of content at AI21 Labs, discusses how generative AI can help screenwriters—not put them out of business, a concern that many have expressed and which is also among the reasons the Writers Guild of America went on strike.
What Can the Sports Betting Industry Teach About Building Predictive Models?
In this contributed article, Darryl Woodford, PhD, CTO at Cipher Sports, shares his insights about the key features of a good sports betting prediction model. We’ve already seen how fast and accurate these models can be. In the wagering industry, the next frontier is to apply these recent advancements to in-play and micro-betting markets, requiring another step forward in both data processing and data acquisition.
Get Lit: 5 Steps to Building your Organization’s Data Literacy as you Prep for AI
In this contributed article, Christine Andrukonis, Workplace Transformation Expert and founder of Notion Consulting, believes that the ultimate goal of data literacy is to provide a framework for data-driven decision-making. Nothing is stopping you from developing a learning program that’s also fun, engaging, and beneficial to employees in all parts of their lives.
The Power of Startups in Turning Big Data into Big Impact
In this contributed article, Ben Younkman, Regional Director at Village Capital, explores how forward-thinking startups are spearheading data-driven innovation to create digital solutions that increase access to critical services — including the financial, educational and healthcare sectors — resulting in a more equitable and inclusive world.
Tips for Responsible Use of Generative AI in Enterprise
In this contributed article, Thor Philogéne, CEO and Founder of Stravito, discusses how the effective integration of generative AI in enterprise requires identifying clear goals and pain points, reliable data, and a human-centric design.
Three Roadblocks to Using Data to Its Full Potential
In this contributed article, Sridhar Bankuru, VP of Software Development at RightData, walks us through the top pain points businesses face today in their data trust journey and, looking ahead, how they can start trusting their data again.
What is a RAG?
In this contributed article, Magnus Revang, Chief Product Officer of Openstream.ai, points out that in the Large Language Model space, one acronym is frequently put forward as the solution to every weakness. Hallucinations? RAG. Privacy? RAG. Confidentiality? RAG. Unfortunately, when people are asked to define RAG, the definitions are all over the place.
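For readers new to the acronym, the pattern most definitions converge on is retrieval-augmented generation: retrieve documents relevant to a query and hand them to the model as grounding context. Below is a minimal sketch of that loop; the `embed`, `retrieve`, and `generate` functions here are hypothetical stand-ins for illustration, not any particular vendor's API.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# embed(), generate(), and the document list are hypothetical
# placeholders, not a specific library's API.

def embed(text: str) -> list[float]:
    # Toy embedding: character-frequency vector over a-z.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank stored documents by similarity to the query; keep the top k.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def generate(prompt: str) -> str:
    # Placeholder for a call to a large language model.
    return f"[LLM answer grounded in a prompt of {len(prompt)} characters]"

documents = [
    "Our refund policy allows returns within 30 days.",
    "Support hours are 9am to 5pm on weekdays.",
    "The warranty covers manufacturing defects for one year.",
]

query = "How long do customers have to return a product?"
context = "\n".join(retrieve(query, documents))
answer = generate(f"Answer using only this context:\n{context}\n\nQuestion: {query}")
print(answer)
```

In practice the toy embedding would be replaced by a learned embedding model and the document list by a vector database, but the retrieve-then-generate shape stays the same.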
Happy Birthday ChatGPT!
Today, November 30, 2023, marks the first anniversary of OpenAI’s ChatGPT. In the last year, the AI chatbot has secured support from major Silicon Valley companies and seen integration across various fields including academia, the arts, marketing, medicine, gaming, and government. These are exciting times, so we decided to put together this round-up of commentaries from around the big data ecosystem. Enjoy!
Rethinking How Data is Stored and Processed Brings Scale and Speed to Modern Data-Intensive Applications
In this contributed article, Prasad Venkatachar, Sr Director – Products & Solutions at Pliops, discusses how modern data-intensive applications such as e-commerce, social networking, messaging, and online gaming services heavily depend on key-value stores. These business-critical applications demand state-of-the-art data storage and processing infrastructure that serves data at high throughput and low latency while remaining highly fault-tolerant and cost-effective. To achieve this blend of high performance and cost effectiveness, we must fundamentally reimagine how data is stored and processed at scale and speed. The article covers how organizations can accomplish these design objectives and architect state-of-the-art data storage and processing infrastructure.
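To make "key-value store" concrete for readers who haven't used one: the workload reduces to puts and gets on opaque keys, which is what makes very low-latency serving feasible. The sketch below uses a plain in-memory dictionary as a hypothetical stand-in for the purpose-built storage engines the article discusses.

```python
# Illustrative key-value access pattern; an in-memory dict stands in
# for a real storage engine. This is a sketch, not a production store.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class KVStore:
    _data: dict[bytes, bytes] = field(default_factory=dict)

    def put(self, key: bytes, value: bytes) -> None:
        # Writes are single keyed inserts; no schema or joins involved.
        self._data[key] = value

    def get(self, key: bytes) -> bytes | None:
        # Reads are point lookups by key, which keeps latency low.
        return self._data.get(key)

    def delete(self, key: bytes) -> None:
        self._data.pop(key, None)

# Example: a session cache for an e-commerce application.
store = KVStore()
store.put(b"session:42", b'{"cart": ["sku-123"], "user": "alice"}')
print(store.get(b"session:42"))
```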