iTWire - Data Analytics & Artificial Intelligence iTWire - Technology News and Jobs Australia https://itwire.com/business-it-news/data.html 2024-09-12T18:24:20+10:00 Joomla! - Open Source Content Management The modern world of data and AI needs a new approach to data warehousing explains Teradata CEO 2024-09-09T22:45:09+10:00 2024-09-09T22:45:09+10:00 https://itwire.com/business-it-news/data/the-modern-world-of-data-and-ai-needs-a-new-approach-to-data-warehousing-explains-teradata-ceo.html David M Williams stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/79e2c1b57237e52c238007ff4219265d_S.jpg" alt="The modern world of data and AI needs a new approach to data warehousing explains Teradata CEO" /></div><div class="K2FeedIntroText"><p>Although Teradata has almost a half-century of on-prem data warehousing experience, it's a cloud-first business today and is bringing its considerable experience to help its customers modernise. Teradata CEO Steve McMillan spoke to <em>iTWire</em> about growth, AI, and the changing nature of data.</p> </div><div class="K2FeedFullText"> <p><a href="https://www.teradata.com/" target="_blank" rel="nofollow noopener">Teradata</a> has been in the data warehousing business for so long that the term "data warehouse" didn't even exist when the company started out, more than four decades ago. Today Teradata is still a massive data company, but when CEO <a href="https://www.linkedin.com/in/steve-mcmillan/" target="_blank" rel="nofollow noopener">Steve McMillan</a> took the helm in June 2020 he declared it would be a cloud-first business. That journey has paid dividends, with hundreds and hundreds of customers now part of its cloud business.</p> <p>Teradata sees half a billion dollars in annual recurring revenue (ARR) through its cloud arm, and is growing at over 30% year on year.</p> <p>Major organisations are dependent on Teradata, and the cloud-first strategy "has given us the opportunity to demonstrate our Teradata platform can run mission critical workloads successfully in the cloud," McMillan said. "The success has built up a head of steam, and enabled us to demonstrate our maturity in this space."</p> <p>Australia is a significant part of the Teradata story. "I'm really proud of the Australian team," he said. "The majority of our customers in Australia are on a modernisation journey with Teradata."</p> <p>Now, you don't need to read too far in the news these days to find companies talking about cloud and modernisation. However, when it comes to Teradata these words are significant. Teradata customers are huge; the business has been known for its on-premises strength since 1979. Finance, retail, telco, government, and other major industries live on Teradata with vast mountains of data. Helping these organisations shift into hybrid operations with a cloud component is no mean feat, and is not without the challenges of inertia. These are not overnight projects; they involve complex rules around governance, provenance, sovereignty, and more.</p> <p>Yet, Teradata has made it happen. "Over 70% of our customers are operating in a hybrid environment today," McMillan said. "Still a lot is on-premises, but data sets are moving or have moved to the cloud."</p> <p>What drives this is the desire to get the best out of that data. "Enterprise data warehousing has evolved over the last 30 to 40 years of taking data from all kinds of silos and putting it into one single store," he said.
"But data has gravity."</p> <p>"We have a capability called <a href="https://www.teradata.com/platform/data-fabric" target="_blank" rel="nofollow noopener">QueryGrid</a> that enables customers to move a query to the data as opposed to moving all the data to the query. With QueryGrid we can have a Teradata ecosystem in Azure, in AWS, on-prem, and the QueryGrid enables you to send the query to the appropriate Teradata ecosystem sending back results without moving lots of data."</p> <p>This is the way of modern companies. "We've found customers have a whole range of workloads, some of which require highly advanced dedicated storage mechanisms with Teradata on top to run super-complex workloads."</p> <p>In fact, today's customers most likely "have 10x the data in native object stores than in structured data warehouses."</p> <p>A major factor pushing companies to consider their data strategies is the real power of artificial intelligence, as seen in machine learning and generative AI, both made viable by the sheer power of today's compute engines.</p> <p>"A Gartner study said over 90% of ANZ CIOs would have AI implementations in 2026," McMillan said. He's seeing huge growth in the Teradata platform being used from that AI perspective.</p> <p>In fact, back in August 2022 - "just before the ChatGPT craziness in November 2022" - Teradata launched new AI and ML capabilies named <a href="https://www.teradata.com/platform/clearscape-analytics" target="_blank" rel="nofollow noopener">ClearScape</a>. "We've been helping customers take advantage of AI, and now GenAI, through the Teradata platform. We've seen super-interesting answers driving good use cases from Teradata customers," he said.</p> <p>And over the last 12 to 18 months McMillan has definitely observed a maturing in AI projects. He's observed three important characteristics that measure if a project will be successful or not. This is valuable information from a global CEO, gleaned from the experiences of major organisations. These are:</p> <ol> <li>The project must ensure trust. "Make sure it's training with trusted data inside your ecosystem to have more assurity over the outputs," McMillan said. Even if you don't want to have generative AI talk directly to your customers, you can still have it augment data for your staff. "We're seeing human-centric GenAI solutions that empower people inside the organisation, giving advanced capabilities."</li> <li>The generative AI models must be ethical. "Trusted and ethical are super important," McMillan said. You must make sure no bias is introduced. "Some AIs are put in place with implicit bias in them. For example, against certain members of the population, either by economic stature or racial stature, or other. We make sure we can help customers with those ethical solutions. For example, you can have a capability to run A/B comparisons for advanced models."</li> <li>The project must be sustainable. It has to have some green credentials; there's no point having a super advanced AI that costs so much to run it's taking your business backwards. Here Teradata can bring all its expertise to bear. "We help implement AI that can be cost effective, and cost efficient, leveraging Teradata technology to optimise language models," McMillan said.</li> </ol> <p>These are the three factors that successful AI projects have in common. However, you might still be wondering what would be a good use case for AI in your company, in the first place. 
You know AI is the key to getting ahead, to unlocking the hidden value of your data to speed up customer service, to accelerate time to market, to create content quickly, and more. But what does this mean in a practical sense, in your specific situation? Again, McMillan has three observations from his experience. "These are the three horizons in terms of customers utilising AI," he said.</p> <ol> <li>Improve the efficiency and effectiveness of people inside your company. This might be achieved by using commercially available AI features in tools you already use, such as the various Copilots being released.</li> <li>Embed AI, such as generative AI, into your products or services. In this way you help your customers gain efficiency and effectiveness.</li> <li>Identify a way to use AI to transform your industry. An example McMillan provides is that of Unilever; the chief data officer for Unilever told him how merchandisers would speak with buyers like Walmart to negotiate over the price of a bar of soap, for example. Now Unilever has developed advanced models in terms of demand and supply to help augment the merchandiser with information, while Walmart has done the same for its buyers. "It's like how high-frequency trading transformed the stock exchange. It moves to a battle of models. We find that really interesting," he said.</li> </ol> <p>McMillan adds, "our approach in Teradata is not to lock a company into one kind of model. We believe in the future companies won't just use large language models with trillions of parameters, but will use small and medium language models with much smaller parameters but far more specialised in terms of results that come out."</p> <p>"This will ensure better data quality, minimised hallucinations, bias removed, and better output sets."</p></div>
"I'm really proud of the Australian team," he said. "The majority of our customers in Australia are on a modernisation journey with Teradata."</p> <p>Now, you don't need to read too far in the news these days to find companies talking about cloud and modernisation. However, when it comes to Teradata these words are significant. Teradata customers are huge; the business is known for its on-premises strength since 1979. Finance, retail, telco, government, and other major industries live on Teradata with vast mountains of data. To help these organisations shift into hybrid operations with a cloud component is no mean feat, and is not without challenges of inertia. These are not overnight projects; these have complex rules around governance, providence, sovereignty, and more.</p> <p>Yet, Teradata has made it happen. "Over 70% of our customers are operating in a hybrid environment today," McMillan said. "Still a lot is on-premises, but data sets are moving or have moved to the cloud."</p> <p>What drives this is the desire to get the best out of that data. "Enterprise data warehousing has evolved over the last 30 to 40 years of taking data from all kinds of silos and putting it into one single store," he said. "But data has gravity."</p> <p>"We have a capability called <a href="https://www.teradata.com/platform/data-fabric" target="_blank" rel="nofollow noopener">QueryGrid</a> that enables customers to move a query to the data as opposed to moving all the data to the query. With QueryGrid we can have a Teradata ecosystem in Azure, in AWS, on-prem, and the QueryGrid enables you to send the query to the appropriate Teradata ecosystem sending back results without moving lots of data."</p> <p>This is the way of modern companies. "We've found customers have a whole range of workloads, some of which require highly advanced dedicated storage mechanisms with Teradata on top to run super-complex workloads."</p> <p>In fact, today's customers most likely "have 10x the data in native object stores than in structured data warehouses."</p> <p>A major factor pushing companies to consider their data strategies is the real power of artificial intelligence, as seen in machine learning and generative AI, both made viable by the sheer power of today's compute engines.</p> <p>"A Gartner study said over 90% of ANZ CIOs would have AI implementations in 2026," McMillan said. He's seeing huge growth in the Teradata platform being used from that AI perspective.</p> <p>In fact, back in August 2022 - "just before the ChatGPT craziness in November 2022" - Teradata launched new AI and ML capabilies named <a href="https://www.teradata.com/platform/clearscape-analytics" target="_blank" rel="nofollow noopener">ClearScape</a>. "We've been helping customers take advantage of AI, and now GenAI, through the Teradata platform. We've seen super-interesting answers driving good use cases from Teradata customers," he said.</p> <p>And over the last 12 to 18 months McMillan has definitely observed a maturing in AI projects. He's observed three important characteristics that measure if a project will be successful or not. This is valuable information from a global CEO, gleaned from the experiences of major organisations. These are:</p> <ol> <li>The project must ensure trust. "Make sure it's training with trusted data inside your ecosystem to have more assurity over the outputs," McMillan said. Even if you don't want to have generative AI talk directly to your customers, you can still have it augment data for your staff. 
"We're seeing human-centric GenAI solutions that empower people inside the organisation, giving advanced capabilities."</li> <li>The generative AI models must be ethical. "Trusted and ethical are super important," McMillan said. You must make sure no bias is introduced. "Some AIs are put in place with implicit bias in them. For example, against certain members of the population, either by economic stature or racial stature, or other. We make sure we can help customers with those ethical solutions. For example, you can have a capability to run A/B comparisons for advanced models."</li> <li>The project must be sustainable. It has to have some green credentials; there's no point having a super advanced AI that costs so much to run it's taking your business backwards. Here Teradata can bring all its expertise to bear. "We help implement AI that can be cost effective, and cost efficient, leveraging Teradata technology to optimise language models," McMillan said.</li> </ol> <p>These are the three factors that successful AI projects have in common. However, you might still be wondering what would be a good use case for AI in your company, in the first place. You know AI is the key to getting ahead, to unlocking the hidden value of your data to speed up customer service, to accelerate time to market, to create content quickly, and more. But what does this mean in a practical sense, in your specific situation? Again, McMillan has three observations from his experience. "These are the three horizons in terms of customers utilising AI," he said.</p> <ol> <li>Improve the efficiency and effectiveness of people inside your company. This might be achieved by using commercially available AI features in tools you already use, such as the various Copilots being released.</li> <li>Embed AI, such as generative AI, into your product or services. In this way you help your customers gain efficiency and effectiveness.</li> <li>Identify a way to use AI to transform your industry. An example McMillan provides is that of Unilever; the chief data officer for Unilever told him how merchaniders would speak with buyers like Walmart to negotiate over the price of a bar of soap, for example. Now Unilever has developed advanced models in terms of demand and supply to help augment the merchandiser with information, while Walmart has done the same for its buyers. "It's like how high-frequency trading transformed the stock exchange. It moves to a battle of models. We find that really interesting," he said.</li> </ol> <p>McMillan adds, "our approach in Teradata is not to lock a company into one kind of model. We believe in the future companies won't just use large language models with trillions of parameters, but will use small and medium language models with much smaller parameters but far more specialised in terms of results that come out."</p> <p>"This will ensure better data quality, minimised hallucinations, bias removed, and better output sets."</p></div> Queensland Government taps Databricks for new Data and AI Academy 2024-08-28T11:11:49+10:00 2024-08-28T11:11:49+10:00 https://itwire.com/business-it-news/data/queensland-government-taps-databricks-for-new-data-and-ai-academy.html Gordon Peters stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/7959602df5dac1d9ad90c0c0709b1eb6_S.jpg" alt="Adam Beavis, Vice President and General Manager for Databricks in ANZ. 
" /></div><div class="K2FeedIntroText"><p>Data and AI company, Databricks has launched a Data and AI Academy for Queensland State Government agencies, designed to expedite data and AI training and enhance the capabilities of Australian public sector staff.</p> </div><div class="K2FeedFullText"> <p>DataBricks says that upon completing the courses, participants will be able to obtain the necessary Databricks accreditations and certifications.</p> <p>“Databricks Data and AI Academy will upskill more than 100 Queensland Government staff on data, analytics, and AI, helping them to become more proficient with innovative technology to improve productivity and achieve better results,” notes Databricks.</p> <p>“The strategic programme will also allow them to then train their colleagues, empowering public sector staff to better leverage the suite of data and AI capabilities at their disposal through the Databricks Platform.”</p> <p>{loadposition peter}</p> <p>“Our organisation’s digital transformation is well underway, yet a recent review of our data and AI capabilities revealed a skills gap when it came to deploying certain functionalities. Databricks' leading Data and AI Academy offered us the opportunity to empower our staff and ensure skills uplift within our workforce, owing to the programme's inherent knowledge sharing feature,” said <em><strong>Peter How, General Manager, Innovation &amp; Delivery, National Injury Insurance Scheme, Queensland.</strong></em></p> <p>“The learnings from the program will inform and fuel our organisation's capabilities for years to come, enabling our teams to co-design and provide improved services for our participants, external stakeholders and internal staff”.</p> <p>How notes that the Data and AI Academy also promotes citizen data capabilities by automating complex preprocessing, engineering, and model training processes, enabling users to easily build, train, and deploy their own models through a low-code approach.</p> <p>“Key to our department’s data strategy is the data and digital literacy of our staff, in a rapidly changing environment. The Data and AI Academy is just what we have been looking for as we face skills gaps across the department in leveraging data and AI processing capabilities and mounting implementations,” said <em><strong>Damon Atzeni, Director of Data and Analytics, Queensland Health.</strong></em></p> <p>”The Academy is our fast track to understanding the power of the solution we have and developing the skills to use it effectively. 
The program has enabled us to self-manage deployments in reforming our health data, resulting in greater administrative efficiency and patient care insights.”</p> <p>“Between Australia’s tight talent market and the quick rise and rapid evolution of AI capabilities, many organisations understand they must double down on training efforts to run best-in-class operations,” said <em><strong>Adam Beavis, Vice President and General Manager for Databricks in ANZ.</strong></em></p> <p>“Our leading data and AI academy programme uniquely positions us to assist and guide institutions like the NIISQ and Queensland Health in upskilling staff to accelerate the development of different use cases and their implementation to further innovation.”</p> <p>The National Injury Insurance Scheme, Queensland (NIISQ) and Queensland Health adopted the Databricks Data and AI Academy following a Databricks audit of the state government departments’ data strategies and their need for refined implementation strategies.</p> <p>Databricks hopes to expand the programme to additional departments throughout the Queensland Government.</p> <p>About Databricks<br />Databricks is the Data and AI company. More than 10,000 organisations worldwide — including Block, Comcast, Condé Nast, Rivian, Shell and over 60% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to take control of their data and put it to work with AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on LinkedIn, X and Facebook.</p></div>
Cloudian, Lenovo unveil AI data lake platform 2024-08-23T08:24:55+10:00 2024-08-23T08:24:55+10:00 https://itwire.com/business-it-news/data/cloudian,-lenovo-unveil-ai-data-lake-platform.html Kenn Anthony Mendoza stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/a9279a977f73579a76f623b29f2e0d4e_S.jpg" alt="Cloudian managing director Asia Pacific and Japan James Wright" /></div><div class="K2FeedIntroText"><p>Software company Cloudian has partnered with Lenovo to unveil the Cloudian HyperStore AI data lake platform.</p> </div><div class="K2FeedFullText"> <p>Built on Lenovo ThinkSystem SR635 V3 all-flash servers with AMD EPYC 9454P processors, the new solution demonstrated performance of 28.7 GB/s reads and 18.4 GB/s writes from a cluster of six power-efficient, single-processor servers, delivering a 74% power efficiency improvement versus an HDD-based system in Cloudian testing.</p> <p>Lenovo combines Cloudian’s AI-ready data platform software with its all-flash Lenovo ThinkSystem SR635 V3 servers and 4th Gen AMD EPYC processors to deliver a data management solution for AI and data analytics.</p> <p>“There’s a big focus on the AI boom in Australia, New Zealand and across APAC, and it’s easy to see why when bodies like the CSIRO say the Australian market alone could be worth close to $500 billion in the next few years,” says Cloudian managing director Asia Pacific and Japan James Wright (pictured).</p> <p>“But there’s a storage and infrastructure layer that companies and government agencies need to power the data-hungry workloads central to AI’s performance and functionality. What’s out there now simply won’t cut it. Imagine trying to power the mobile applications we use today with the simple mobile phones we had 20 years ago – it wouldn’t work and it’s no different at the infrastructure level, particularly with AI in play.”</p> <p>“For organisations looking to innovate or drive research and discovery with AI, ML, and HPC, this solution promises to be transformative,” says Cloudian CEO and co-founder Michael Tso.</p> <p>Built for mission-critical, capacity-intensive workloads, the platform features exabyte scalability, S3 API compatibility, military-grade security, and Object Lock for ransomware protection.</p> <p>“Combining Lenovo’s high-performance all-flash AMD EPYC CPU-based servers with Cloudian's AI data lake software creates a solution that can handle the most demanding AI and analytics workloads,” says Lenovo general manager Stuart McRae.</p> <p>“This partnership enables us to offer our customers a cutting-edge, scalable, and secure platform that will help them accelerate their AI initiatives and drive innovation.”</p> <p>“AI workloads demand a lot from storage. Our 4th Gen AMD EPYC processors together with Lenovo's ThinkSystem servers and Cloudian's AI data lake software deliver the performance and scalability that AI users need,” says AMD corporate vice president strategic business development Kumaran Siva.
“The single socket, AMD EPYC CPU-based Lenovo ThinkSystem SR635 V3 platform provides outstanding throughput combined with excellent power and rack efficiency to accelerate AI innovation.”</p></div> Want GenAI?
Then you need a document database for the best result says MongoDB exec 2024-08-21T23:42:18+10:00 2024-08-21T23:42:18+10:00 https://itwire.com/business-it-news/data/want-genai-then-you-need-a-document-database-for-the-best-result-says-mongodb-exec.html David M Williams stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/0544196a811ae08e3b8b24aca7593b46_S.jpg" alt="Want GenAI? Then you need a document database for the best result says MongoDB exec" /></div><div class="K2FeedIntroText"><p>Generative AI is bringing vast business benefits, from summarising documents to helping with customer service, even aiding organisations in asking questions of complex systems in plain language. However, you might not be using the right tools for the job. MongoDB field CTO Rick Houlihan experimented and found vast performance gains when using a document database over a relational one.</p> </div><div class="K2FeedFullText"> <p>In the world of databases, <a href="https://mongodb.com" target="_blank" rel="nofollow noopener">MongoDB</a> is a leader in the 'NoSQL' movement. Relational databases trace their roots to mathematical set theory, and their rules of relational algebra were formalised by E.F. Codd at IBM and other researchers from 1970 onwards. Relational databases power sales, payrolls, inventory, flight schedules, and all kinds of enterprise purposes around the world.</p> <p>Yet, in a modern world where text - and specifically, natural language - is becoming a major force, the relational database may simply not be the right choice.</p> <p>"Third normal form is a great mathematical and logical representation of data," said MongoDB field CTO <a href="https://www.linkedin.com/in/rickhoulihan/" target="_blank" rel="nofollow noopener">Rick Houlihan</a>, referring to the relational model of using tables with linked fields to ensure a single set of master data without redundancy. "But it has a high time complexity to map data together."</p> <p>"MongoDB makes it easier to work with data. Our core database, the document database, was built to remove abstractions from data. A document database brings data out in a better way," he said. "We don't have to work with data in third normal form in apps."</p> <p>In fact, developers often find it a challenge to get their heads around relational database mechanisms. It's why, Houlihan notes, there's a whole slew of object-relational mapping (ORM) tools. One such example is the popular Entity Framework for .NET; such tools are used by developers to remove abstractions from relational databases. Houlihan says you simply don't have that fuss in a document database; it just works differently.</p> <p>"We say just store the data how you use it. Make document structures that map to your access patterns," he said. "It's more efficient."</p> <p>And, when it comes to generative AI - which institutions all around the world are working hard to find and pilot use cases for - the choice of database can make a huge difference.</p>
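<p>To make the "store the data how you use it" idea concrete, here is a minimal, purely illustrative sketch (not taken from MongoDB or Houlihan) of an order kept as a single document shaped around its access pattern, rather than normalised across several tables. The connection string, database, collection, and field names below are assumptions invented for the example.</p>
<pre><code># Hedged illustration of the document-model idea: embed the data an
# application reads together instead of joining normalised tables.
# Requires the pymongo driver; all names here are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
orders = client["shop"]["orders"]                  # hypothetical db/collection

# One document holds the customer and line items the app displays together.
orders.insert_one({
    "order_id": 1001,
    "customer": {"name": "Jane Citizen", "email": "jane@example.com"},
    "items": [
        {"sku": "SOAP-01", "qty": 3, "price": 2.50},
        {"sku": "TOWEL-02", "qty": 1, "price": 12.00},
    ],
    "status": "shipped",
})

# The access pattern (fetch an order with everything needed to render it)
# becomes a single lookup, with no joins.
print(orders.find_one({"order_id": 1001}))
</code></pre>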
<p>Houlihan is more than willing to put his money where his mouth is. "I've always been a big fan of <a href="https://en.wikipedia.org/wiki/Grace_Hopper" target="_blank" rel="nofollow noopener">Grace Hopper</a>, who <a href="https://mathshistory.st-andrews.ac.uk/Biographies/Hopper/quotations/" target="_blank" rel="nofollow noopener">said</a> 'one accurate measurement is worth a thousand expert opinions'," he said. Thus, earlier this year he <a href="https://www.linkedin.com/pulse/comparing-document-data-options-generative-ai-rick-houlihan-pnf5e/" target="_blank" rel="nofollow noopener">tested for himself</a> how well different databases could support generative AI, with truly eye-opening results.</p> <p>Using the exact same hardware, with Postgres and MongoDB set up under clearly stated configurations and parameters, Houlihan loaded single attributes and multiple attributes of increasing size into the databases. This replicates the type of data generative AI deals with; it's not about simple numeric order IDs or product SKUs or surnames and first names. Rather, GenAI is all about huge chunks of text - contracts, manuals, documentation. Even if the text is chunked, it's still in blocks of 4KB or more. It's a scenario that a document database excels at, and a relational database does not.</p> <p>Houlihan's testing showed that for small block sizes MongoDB and Postgres compared relatively evenly, until the payload size ramped up. Whether using Postgres <a href="https://www.mongodb.com/resources/basics/json-and-bson">JSON</a> (a widely used data interchange format popular across many applications and technology stacks) or JSONB, after a mere 200 bytes their processing time to insert data began increasing significantly. Meanwhile, MongoDB retained a reasonably linear insert time irrespective of the size of the data.</p> <p>For example, to insert 200 attributes at 4,000 bytes Postgres took 37.2s using JSONB and 17.5s using JSON, while MongoDB did the same work in fractionally over a second.</p> <p>The read workload running against the same data took 53.8s and 27.8s in Postgres for JSONB and JSON respectively, against 8.4s in MongoDB.</p> <p>Data types like JSON or VARIANT can help to shoe-horn large objects into relational databases, but the takeaway from Houlihan's experiments is clear. Relational databases suffer from the performance limitations of wide rows and large data attributes, while a document database such as MongoDB takes them in its stride.</p> <p>Relational databases are valuable, and important, of course, but Houlihan's message is you need to use the right tool for the right task. And, when it comes to generative AI, the right tool is a document database like MongoDB.</p> <p>Houlihan published his work in a <a href="https://github.com/rhoulihan/BSON-JSON-bakeoff" target="_blank" rel="nofollow noopener">GitHub repo</a> for the world to see and for anyone to validate for themselves.</p>
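<p>Houlihan's repository is the authoritative reference for how the bakeoff was run; purely to illustrate the shape of such a test (this is not his harness), a simplified sketch might time inserts of large JSON-like payloads into Postgres (JSONB) and MongoDB. The connection strings, table and collection names below are assumptions.</p>
<pre><code># Hedged, simplified sketch in the spirit of the benchmark described above.
# NOT Houlihan's harness; connection details and names are illustrative.
import time
import psycopg2
from psycopg2.extras import Json
from pymongo import MongoClient

payload = {f"attr{i}": "x" * 4000 for i in range(200)}  # 200 attributes of ~4KB each
N = 100  # rows/documents per run, kept small for illustration

# --- Postgres, storing the payload in a JSONB column ---
pg = psycopg2.connect("dbname=benchdb user=bench")       # assumed local database
cur = pg.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS docs (id serial PRIMARY KEY, body jsonb)")
pg.commit()
start = time.perf_counter()
for _ in range(N):
    cur.execute("INSERT INTO docs (body) VALUES (%s)", [Json(payload)])
pg.commit()
print(f"Postgres JSONB inserts: {time.perf_counter() - start:.1f}s")

# --- MongoDB, storing the payload as native documents ---
coll = MongoClient("mongodb://localhost:27017")["benchdb"]["docs"]  # assumed instance
start = time.perf_counter()
for _ in range(N):
    coll.insert_one(dict(payload))  # copy so each insert gets its own _id
print(f"MongoDB inserts: {time.perf_counter() - start:.1f}s")

# The real bakeoff controls batching, configuration, and read workloads far
# more carefully; this only shows the general pattern of such a comparison.
</code></pre>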
<p>It's not simply an experiment; when it comes to MongoDB and GenAI "there are companies doing real things, with real impact," Houlihan said. He cites <a href="https://pathfinderlabs.nz/" target="_blank" rel="nofollow noopener">Pathfinder</a> as an example, an organisation which uncovers cybercrime evidence, collates it, and then uses AI to find similarities and identify perpetrators of such evils as human trafficking and exploitation.</p> <p>Another is <a href="https://www.novonordisk.com/" target="_blank" rel="nofollow noopener">Novo Nordisk</a>, which leverages MongoDB and generative AI to improve health care and advance medical treatment for common diseases like diabetes and cancer. Reducing the time required to compile clinical research reports for regulatory approval of new pharmaceuticals from 12 weeks to just 10 minutes has empowered the business to do more with less.</p> <p>Meanwhile, MongoDB has a big Australian connection, Houlihan explained. While MongoDB is available as a free product, it also comes as a managed service called <a href="https://www.mongodb.com/products/platform/atlas-database" target="_blank" rel="nofollow noopener">Atlas</a>. A new feature, <a href="https://www.mongodb.com/docs/charts/" target="_blank" rel="nofollow noopener">Atlas Charts</a>, uses natural language to easily visualise data and help end users self-serve meaningful, actionable insights from their data without having to wait for specialised BI developers to be freed up.</p> <p>Atlas Charts is the work of MongoDB's Sydney-based engineering team. This team has also been a big part of the company's <a href="https://www.mongodb.com/products/tools/relational-migrator" target="_blank" rel="nofollow noopener">Relational Migrator</a>, a service that uses generative AI to help organisations migrate relational databases to MongoDB, among many other projects.</p> <p>Houlihan previously spent time as the first technical product manager for <a href="https://aws.amazon.com/documentdb/" target="_blank" rel="nofollow noopener">AWS DocumentDB</a>, building a NoSQL centre of excellence there. Now he is at MongoDB, taking his big ideas further.</p> <p>What attracted him to MongoDB was how the product "wraps functionality behind a unified API where devs don't have to learn five or six different tech stacks. We don't reinvent the wheel; we invest in the core service and then add the best-of-breed from the industry into the product." An example is <a href="https://lucene.apache.org/" target="_blank" rel="nofollow noopener">Lucene</a>; it's the most popular full-text search engine, and the same one that backs Elastic, among other popular products. "So, we built Lucene into our own API to reduce developer overhead in working with the data."</p> <p>"On top of that," he said, "our founders had geographical data distribution in mind from the beginning. The relational database is not built with geographic distribution in mind. Layers can be added on top, like log shipping for Postgres or GoldenGate for Oracle, but it's often left for the developer to solve."</p> <p>By contrast, "it's a first-class citizen in MongoDB. We combine the flexibility of the document model with the ability to determine on each individual write how the data should be replicated and what level of consistency is required. It's a really novel way of working with data and how to store and access it."</p> <p>"We work with the largest financial institutions in the world. These companies run extremely high-velocity trading and payment processing applications; the tech we provide for those kinds of workloads drives an enormous amount of efficiency," Houlihan said.</p></div>
How AI is Transforming the Banking Industry 2024-08-21T11:50:52+10:00 2024-08-21T11:50:52+10:00 https://itwire.com/business-it-news/data/how-ai-is-transforming-the-banking-industry.html Guest Writer stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/ad35931152008a2bb314386907a07555_S.jpg" alt="How AI is Transforming the Banking Industry" /></div><div class="K2FeedIntroText"><p>Artificial intelligence is changing the banking business, with major implications for both traditional and new banks. The industry is shifting from traditional, data-driven AI to sophisticated generative AI, which offers unprecedented levels of efficiency and customer engagement. In the banking industry, generative AI has the potential to increase productivity by 5% and decrease worldwide spending by $300 billion, according to McKinsey's 2023 banking report.</p> </div><div class="K2FeedFullText"> <p>However, that only represents a portion of the overall picture. The integration of AI in banking apps and services has made the industry more customer-centric and digitally relevant. Because AI-based systems are more productive and can make judgments from volumes of data no human could process, banks are now able to reduce costs.</p> <p>Furthermore, sophisticated algorithms are able to identify misleading data in a couple of seconds.
This article will examine how AI is bringing changes that can help revolutionize the banking industry.</p> <p><strong>How Banking Has Evolved With Artificial Intelligence</strong></p> <p>AI's role in banking started with data analysis and task automation, but it has since grown to include complex applications in risk management, fraud prevention, and customized customer care. The emergence of <a href="https://www.debutinfotech.com/generative-ai-development-company" target="_blank" rel="noopener"><strong>generative AI development</strong></a>, which can create and anticipate using vast amounts of data, is a significant breakthrough that could further alter banking operations and strategy.</p> <p>These are not the only uses of artificial intelligence in banking. Risk management, process organization, and security have always been top priorities for traditional banks; yet, until recently, customer satisfaction and engagement have been lacking.</p> <p>Nearly 80% of banks, according to a Business Insider report, are aware of the possible advantages of artificial intelligence in banking. According to a different McKinsey analysis, <a href="https://www.debutinfotech.com/blog/generative-ai-in-finance-and-banking" target="_blank" rel="noopener"><strong>artificial intelligence in banking and finance</strong></a> might reach $1 trillion in potential value.</p> <p><strong>Applications Of AI In Banking</strong><img src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXdCq80WP7Rn6ustRO-A-rxA-FHTvmGjYH8kdq_nFfYGGJNiFjJzPq49tK2WJrrO-A6a6Za8CZNNfK9G-Xo2djiMZSKb0cV7y0CLy9KITuc4cAKNbVyzJKbrxG280EESAC__vqs03QMKdciftskgccSi56kt?key=JYiuHHNFZTeemblLMGLwFw" alt="" width="463" height="253" style="display: block; margin-left: auto; margin-right: auto;" /></p> <p>Our world has increasingly relied on artificial intelligence, and banks have already begun incorporating this technology into their offerings. The following are some significant uses of AI in the banking sector:</p> <p><strong>Fraud Detection and Cybersecurity</strong></p> <p>Artificial intelligence is increasingly being used to detect and manage cyber threats in the banking industry. By utilizing machine learning and AI, banks can detect fraudulent activity, monitor system vulnerabilities, reduce risks, and enhance overall security.</p> <p>For instance, Denmark's largest bank, Danske Bank, has successfully detected fraud 50% more often and identified false positives 60% less frequently. AI also helps banks respond to potential intrusions before they impact staff, clients, or internal systems. This technology is crucial for enhancing the overall security of online banking.</p> <p><strong>Loans and Credit</strong></p> <p>Banks are using AI-based systems to make more informed, safer, and profitable loan and credit decisions. Currently, these systems rely on credit history, scores, and customer references to determine creditworthiness.</p> <p>However, these systems are often flawed due to errors and misclassification of creditors. AI-based systems can analyze customer behaviour and patterns, determine creditworthiness, and send warnings to banks about behaviours that may increase default chances. These technologies are significantly changing consumer lending.</p>
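<p>As a purely illustrative sketch of the pattern-based creditworthiness scoring described above (a hedged example on invented, synthetic data, not any bank's or vendor's actual system), a model might estimate a probability of default from behavioural features and flag risky applications:</p>
<pre><code># Hedged illustration only: a toy probability-of-default model on synthetic,
# invented features. Real credit models involve far more data, validation,
# and regulatory controls.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Hypothetical behavioural features: income ($000s), utilisation, missed payments.
X = np.column_stack([
    rng.normal(60, 15, n),     # annual income in thousands
    rng.uniform(0, 1, n),      # credit utilisation ratio
    rng.poisson(0.5, n),       # missed payments in the last year
])
# Synthetic "default" label loosely driven by utilisation and missed payments.
risk = 0.02 * (80 - X[:, 0]) + 2.0 * X[:, 1] + 0.8 * X[:, 2]
y = (risk + rng.normal(0, 0.5, n) > 2.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score a new, hypothetical applicant: the probability drives the "warning"
# the article mentions, rather than a hard yes/no decision.
applicant = [[45, 0.9, 2]]
print("estimated default probability:", model.predict_proba(applicant)[0, 1])
</code></pre>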
<p><strong>Market Trends</strong></p> <p>AI-ML in financial services helps banks process large volumes of data and predict the latest market trends. Advanced machine learning techniques help evaluate market sentiment and suggest investment options.</p> <p>AI solutions for banking also suggest the best time to invest in stocks and warn when there is a potential risk. Due to its high data processing capacity, this emerging technology also helps speed up decision-making and makes trading convenient for banks and their clients.</p> <p><strong>Chatbots</strong></p> <p>One of the most practical applications of artificial intelligence in banking is chatbots, which are available around the clock and can recognize consumer behaviour patterns. Chatbots can be integrated into banking apps to provide individualized support, lessen workloads, and recommend appropriate financial services.</p> <p>In 2019, Erica, a virtual assistant for Bank of America, efficiently managed more than 50 million client requests, including credit card debt reduction and card security upgrades.</p> <p><strong>Data Collection and Analysis</strong></p> <p>Every day, banking and financial organizations log millions of transactions. Employees find it daunting to collect and register the massive amount of information generated; organizing and recording such a volume of data manually is all but impossible.</p> <p>In these situations, AI-based solutions help collect and analyze data effectively. The data can also be utilized to determine creditworthiness or identify fraud.</p> <p><strong>Tracking Market Trends</strong></p> <p>In the financial services industry, AI-ML enables institutions to process massive amounts of data and forecast current market trends. Sophisticated machine learning methods aid in assessing market mood and making investment recommendations.</p> <p>AI banking solutions also recommend the optimal time to buy equities and flag any dangers. This cutting-edge technology facilitates convenient trading for banks and their clients and helps expedite decision-making due to its high data processing capability.</p> <p><strong>Risk Management</strong></p> <p>The banking and financial industries are significantly impacted by external global events such as political upheaval, natural disasters, and currency changes. Using AI services in banking allows you to stay organized and make timely decisions by providing insights that paint a fairly clear picture of what's ahead.</p> <p>By calculating the likelihood that a client won't repay a loan, AI for banking also assists in identifying riskier applications. It uses smartphone data and historical behavioural patterns to forecast future behaviour.</p> <p><strong>Customer Experience</strong></p> <p>Customers increasingly look for convenience and better experiences; ATMs, for example, are a popular way for people to deposit and withdraw money outside working hours. As a result, artificial intelligence has been incorporated into banking and finance services, improving customer satisfaction and cutting down on the time needed to record Know Your Customer (KYC) data.</p> <p>Additionally, AI technology automates credit qualifying and personal loan approval procedures, speeding up approval times and guaranteeing a positive client experience. Furthermore, AI in banking customer support reliably gathers user data for account setup.</p> <p><strong>Regulatory Compliance</strong></p> <p>Governments enforce strict regulations on the banking industry, ensuring banks don't commit financial crimes and maintain appropriate risk profiles to prevent significant defaults.
<p><strong>Risk Management</strong></p> <p>The banking and financial industries are significantly affected by external global events such as political upheaval, natural disasters, and currency swings. AI services in banking help institutions stay organized and make timely decisions by providing insights that paint a reasonably clear picture of what lies ahead.</p> <p>By calculating the likelihood that a client won't repay a loan, AI for banking also assists in identifying riskier applications. It uses smartphone data and historical behavioural patterns to forecast future behaviour.</p> <p><strong>Customer Experience</strong></p> <p>Customers increasingly expect convenience and better experiences; ATMs, for example, remain a popular way to deposit and withdraw money outside working hours. In response, artificial intelligence has been incorporated into banking and finance services, improving customer satisfaction and cutting down the time needed to record Know Your Customer (KYC) data.</p> <p>Additionally, AI technology automates credit qualification and personal loan approval procedures, speeding up approval times and supporting a positive client experience. AI in banking customer support also reliably gathers user data for account setup.</p> <p><strong>Regulatory Compliance</strong></p> <p>Governments enforce strict regulations on the banking industry to ensure banks don't commit financial crimes and that they maintain appropriate risk profiles to prevent significant defaults. Internal compliance teams are common in banks, but manual procedures are expensive and time-consuming.&nbsp;</p> <p>Compliance regulations also change constantly and require ongoing adjustment. Deep learning, natural language processing (NLP), and machine learning (ML) can enhance decision-making by helping interpret new regulatory requirements. Although AI cannot replace compliance analysts, it can speed up and improve the effectiveness of compliance work.</p> <p><strong>Predictive Analytics</strong></p> <p>Broadly applicable predictive analytics and general-purpose semantic and natural language applications are two of the most prevalent use cases of AI in the banking sector. AI can find patterns and connections in the data that conventional technologies could not detect.&nbsp;</p> <p>These patterns might point to untapped sales or cross-selling prospects, or to operational indicators that directly affect revenue.</p> <p><strong>Process Automation</strong></p> <p>Robotic process automation (RPA) algorithms automate repetitive, time-consuming operations, increasing operational efficiency and accuracy while lowering costs. They also free up staff to concentrate on more intricate procedures that require human judgment.</p> <p>Banking organizations are already using RPA to improve efficiency and transaction speed. For instance, JPMorgan Chase's CoiN technology analyzes documents and extracts data from them far more quickly than a human could.</p> <p><strong>Challenges and Setbacks In Adopting AI in Banking</strong></p> <p>There are challenges in widely implementing cutting-edge technologies like artificial intelligence. Banks that adopt AI face a number of difficulties, including security concerns and a shortage of reliable, high-quality data. Let's look at them now.</p> <p><strong>Data Security</strong></p> <p>The banking sector collects enormous amounts of data, so strong security measures are required to prevent breaches. To ensure client data is managed properly, it is crucial to choose a technology partner who is knowledgeable in both AI and banking and can provide a range of security options.</p> <p><strong>Lack of Quality Data</strong></p> <p>Before implementing a comprehensive AI-based banking system, banks require high-quality, organized data for training and validation to ensure the models apply to real-world scenarios.</p> <p>Furthermore, data that is not machine-readable can cause unexpected behaviour from AI models. Banks must therefore update their data policies to reduce privacy and regulatory risks as they deploy AI services.</p> <p><strong>Lack of Explainability</strong></p> <p>While AI-based decision support systems are useful, they can also reproduce biases inherited from past human decisions. To avoid such problems, banks should ensure that AI-generated decisions and recommendations can be adequately explained, validated, and traced back through the model's decision-making process.</p>
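<p>Explainability can be approached in several ways. One lightweight option, sketched below on synthetic data with invented feature names, is to use an inherently interpretable model and report each feature's contribution to an individual decision; dedicated tools such as SHAP or LIME go much further, but the principle is the same. This is a teaching sketch, not a complete explainability framework.</p> <pre>
# Illustration only: per-feature contributions from an interpretable model
# trained on synthetic data. Feature names and values are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
features = ["income", "card_utilisation", "missed_payments", "account_age_years"]
X = np.column_stack([
    rng.normal(60_000, 15_000, 2000),
    rng.uniform(0, 1, 2000),
    rng.poisson(0.5, 2000),
    rng.uniform(0, 20, 2000),
])
# Synthetic label: default risk loosely driven by utilisation and missed payments.
risk = 0.8 * X[:, 1] + 0.6 * X[:, 2] + rng.normal(0, 0.3, 2000)
y = (risk > 1.0).astype(int)

scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

applicant = np.array([[42_000, 0.92, 3, 1.5]])
scaled = scaler.transform(applicant)
prob = model.predict_proba(scaled)[0, 1]
print("estimated default probability:", round(float(prob), 3))
# Per-feature contribution to the log-odds for this single applicant.
for name, contrib in zip(features, model.coef_[0] * scaled[0]):
    print(f"{name:>20}: {contrib:+.3f}")
</pre>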
<p><strong>Conclusion</strong></p> <p>Artificial Intelligence in banking is causing a major shift in how banks function, provide services, and interact with clients. Through process optimization, improved client experiences, risk mitigation, and data-driven decision-making, artificial intelligence (AI) is helping banks prepare themselves for success in the digital era.</p> <p>Banks will obtain a competitive edge, increase operational effectiveness, and provide new, tailored services that cater to their clients' changing demands if they use AI and incorporate it into their strategy.</p> <p>Banks must, however, approach the deployment of AI with a responsible and strategic mentality, addressing issues with data protection, ethical <a href="https://www.debutinfotech.com/ai-development-company" target="_blank" rel="noopener"><strong>AI development services</strong></a>, and employment effects. 
Banks can fully utilize the revolutionary potential of artificial intelligence and influence the direction of the financial services sector by finding the ideal balance between innovation and responsible AI adoption.</p></div> 5 ways to encourage more use of data in your company 2024-08-19T16:53:08+10:00 2024-08-19T16:53:08+10:00 https://itwire.com/business-it-news/data/5-ways-to-encourage-more-use-of-data-in-your-company.html Cliff Stanton stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/a3fde8616eff43e06ae2237ec94504fa_S.jpg" alt="5 ways to encourage more use of data in your company" /></div><div class="K2FeedIntroText"><p>GUEST OPINION: Data generation has accelerated in recent years with the rise of faster internet connectivity and generative AI.</p> </div><div class="K2FeedFullText"> <p>Despite the rapid generation, gathering, and accumulation of data, it appears most enterprises are not making good use of the information available to them. One study shows that <a href="https://www.seagate.com/em/en/our-story/rethink-data/" target="_blank" rel="noopener">around 68% of enterprise data</a> remains unutilized.</p> <p>Organizations store massive amounts of information, but they are not leveraging it to stimulate innovation, <a href="https://itwire.com/business-it-news/data/big-data-analytics-for-business-transforming-raw-data-into-insights.html" target="_blank" rel="noopener">improve products and services</a>, and enhance marketing and customer relations. If this is going to change, then everyone in an organization will have a role to play, as detailed in the following approaches.</p> <h2>1. Elimination of information silos</h2> <p>Information silos occur when data or knowledge is kept isolated or made exclusively available to certain teams, departments, or systems.</p> <p>While silos may be purposely created for security or privacy reasons, in most instances, they just emerge because of poor data handling or outdated cultures. Unintentional information silos must be dismantled to make data available to those who can extract value from them.</p> <p>Organizations should carefully examine their legacy systems for potential silos. It is also advisable to review departmental information strategies and workplace cultures to address siloing and empower everyone to use data for productive ends.</p> <p>Widening access to data safely can be challenging, so it helps to use solutions like <a href="https://www.informatica.com/" target="_blank" rel="noopener">Informatica</a>, a comprehensive AI-powered data governance and access control solution that helps teams to manage data across multiple clouds and hybrid environments. It helps address data isolation and brings together disparate data systems with its data governing, cataloging, and integration capabilities.</p> <h2>2. Promoting a data-driven workplace culture</h2> <p>Implementing a culture of data use in an organization does not happen overnight. Merely directing employees to “use data” does not build a data-driven culture unto itself. It requires a systematic approach and consistency.</p> <p>One of the key elements in successfully promoting a data-driven culture is leadership. Managers, supervisors, and others in leadership positions should demonstrate and encourage data-driven decision-making. 
It is important to lead by example, by regularly citing data during meetings and requiring reports to reference relevant metrics and qualitative information whenever applicable.</p> <p>In addition to leadership commitment, it is important to establish a clear framework and process for data use. Relying on employee initiative alone is not enough. It is advisable to communicate standards for data formats, relevance, accuracy, and understandability. Employees should be encouraged to learn to transform raw data into structured and contextualized information in line with the established standards.</p> <p>Moreover, organizations should make the process of building a data-driven workplace culture a collaborative effort. Allowing employees to have a say on the framework or procedures helps to get them more committed to the goal.</p> <p>It also helps to establish clear objectives so that progress toward a data-driven culture can be easily monitored and quantified. For example, managers can keep track of the use of relevant data in official documents or communications to check whether employees are building the habit of data use. Managers can also monitor employees’ proficiency in using relevant software tools.</p> <h2>3. Using intuitive AI-enhanced BI tools</h2> <p>With the advent of AI-powered business intelligence software, anyone can perform data analysis or conduct business intelligence without formal training. It is easier than ever to resolve problems or arrive at evidence-backed business decisions.</p> <p>Organizations can invest in intuitive business intelligence software augmented by natural language processing (NLP) and large language models (LLMs), which make the tools significantly easier to use without making their outputs rudimentary.</p> <p>A good example of an intuitive business intelligence solution is <a href="https://www.pyramidanalytics.com/" target="_blank" rel="noopener">Pyramid Analytics</a>, which brings together the advantages of advanced data analytics and generative artificial intelligence (GenAI) to produce generative business intelligence (GenBI). It allows anyone to interact with data without in-depth knowledge of data querying and analysis. Users can ask Pyramid to generate insights or produce interactive data presentations through text or voice-based communication.</p> <p>One challenge with LLM-enhanced business intelligence solutions, however, is the potential for data leaks. As shown by what happened with ChatGPT, it is possible for AI systems to reveal information used in training them or details they obtain as they interact with users. As such, it is advisable to examine an LLM-aided BI tool carefully. Preferably, it should not grant LLMs direct access to enterprise data.</p> <p>Pyramid addresses this issue by adding a layer to the underlying tech, allowing its system to send queries to the LLM of the user’s choice without actually sharing any data with the LLM: all of the analysis and visualization happens within the safety of the installed software.</p>
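<p>The general design pattern described above can be shown with a minimal sketch: only the question and the table schema are sent to a language model, which returns a query that is then executed locally, so no row-level data ever leaves the environment. This is a generic illustration of the pattern, not Pyramid's implementation or API; ask_llm_for_sql() is a placeholder for whichever model endpoint an organization chooses, and the table and question are invented.</p> <pre>
# Minimal sketch of a "only metadata goes to the LLM" pattern (illustrative;
# not any vendor's implementation). ask_llm_for_sql() is a placeholder.
import sqlite3

SCHEMA = "sales(region TEXT, month TEXT, revenue REAL)"  # metadata only

def ask_llm_for_sql(question, schema):
    # Placeholder: in practice this would call the organization's chosen
    # LLM endpoint with the question and schema, but never with row data.
    return "SELECT region, SUM(revenue) AS total FROM sales GROUP BY region"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales(region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("APAC", "2024-07", 120.0), ("EMEA", "2024-07", 95.5), ("APAC", "2024-08", 130.0)],
)

sql = ask_llm_for_sql("Which region earned the most revenue?", SCHEMA)
print(conn.execute(sql).fetchall())   # the query runs locally; data never leaves
</pre>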
<h2>4. Providing data analysis and management courses</h2> <p>There is no excuse for failing to provide line-of-business team members with training on how to use data for their own strategic purposes. There are many data training modules available online. Organizations can partner with online training platforms like <a href="https://www.udemy.com/topic/data-analysis/" target="_blank" rel="noopener">Udemy</a>.</p> <p>Organizations can also run their own custom training sessions, developing data analysis training tailored to different departments to address specific needs. To encourage participation, they can add gamification elements or offer incentives.</p> <p>Data analysis courses give everyone in the organization the foundation they need for a workable understanding of reports and dashboards. These courses also empower employees to contribute ideas to the organization’s data use frameworks. The knowledge they gain allows them to examine, for example, whether the requirements and processes result in greater efficiency or are simply an unnecessary burden.</p> <p>Before picking a course or developing custom training modules, organizations need to make sure the training they provide is in line with their operational and business goals. The training should result in useful “citizen data analysis” skills and fuller use of data to promote innovation and improvements in business outcomes.</p> <h2>5. Implementing data quality measures to build trust in data</h2> <p>It is difficult to promote data use if the data available in an organization is unreliable. That’s why it is crucial to establish data quality standards. All the data maintained in an organization should be accurate, complete, consistent, and relevant. The data stored should be properly labeled and structured. Timestamps should also be correctly generated to make it easy to find the most recently uploaded information.</p> <p>Organizations need a good data management system to ensure they only keep and use accurate and relevant data. Unnecessary duplicates must be removed to maximize storage space. Updated data should be properly labeled, and old or obsolete data must be erased unless it is deemed necessary for historical context.</p> <p>In some cases, it helps to appoint a data steward to resolve data conflicts or perform enrichment to find missing information and fix incomplete datasets.</p> <p>Organizations can use tools like <a href="https://www.datafold.com/" target="_blank" rel="noopener">Datafold</a> to automate data testing and quickly spot data quality issues. The platform can perform sophisticated reconciliations between versions of the same data set. It also executes continuous integration and deployment (CI/CD) testing, enabling users to maximize data visibility and obtain insights into the impact of code changes.</p>
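<p>Basic versions of the checks described above can be scripted in a few lines. The sketch below uses invented column names and is unrelated to Datafold's actual product; it simply flags duplicate keys, missing values, and row-count drift between two versions of a dataset, which is the spirit of automated data testing.</p> <pre>
# Illustrative data-quality checks on two versions of a synthetic dataset.
# Column names are invented; dedicated tools offer far deeper reconciliation.
import pandas as pd

v1 = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "email": ["a@x.com", "b@x.com", None, "d@x.com"],
    "updated_at": pd.to_datetime(["2024-07-01", "2024-07-02", "2024-07-02", "2024-07-03"]),
})
v2 = pd.concat([v1, v1.tail(1)], ignore_index=True)   # simulate a duplicate slipping in

def quality_report(df, key):
    return {
        "rows": len(df),
        "duplicate_keys": int(df[key].duplicated().sum()),
        "null_cells": int(df.isna().sum().sum()),
        "latest_update": str(df["updated_at"].max()),
    }

r1 = quality_report(v1, "customer_id")
r2 = quality_report(v2, "customer_id")
print("v1:", r1)
print("v2:", r2)
print("row count drift:", r2["rows"] - r1["rows"])
</pre>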
<h2>Conclusion</h2> <p>Data should be an asset for businesses. It is unfortunate that not enough organizations take advantage of their data repositories, resulting in unexplored or untapped potential. Teams can use data to improve decision-making, products and services, marketing and sales efforts, customer relations, and many other areas. 
To get started with productive data use, it helps to demolish information silos, establish a data-driven workplace, use AI-enhanced business intelligence solutions, provide data analysis and management courses, and implement data quality measures to build data trustworthiness.</p></div> From SEO to LMO: HubSpot launches the first free tool for AI discovery 2024-08-15T10:54:33+10:00 2024-08-15T10:54:33+10:00 https://itwire.com/business-it-news/data/from-seo-to-lmo-hubspot-launches-the-first-free-tool-for-ai-discovery.html HubSpot stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/d91f373d7922e7ebe32cc87bab104a2b_S.jpg" alt="From SEO to LMO: HubSpot launches the first free tool for AI discovery" /></div><div class="K2FeedIntroText"><p>PRODUCT ANNOUNCEMENT:&nbsp; AI Search Grader analyses brand awareness and sentiment across AI chatbots as more customers turn to tools like ChatGPT for search</p> </div><div class="K2FeedFullText"> <p>AI Search Grader analyses brand awareness and sentiment across AI chatbots as more customers turn to tools like ChatGPT for search</p> <p>Customers and prospects are ditching traditional search in favour of researching with AI for quick, personalised answers. In fact, usage of tools like ChatGPT to answer questions is up 37%, while search engines are down 11%.* For marketers who’ve been focused on getting their brands discovered with SEO, now is the time to master a new craft: Language Model Optimisation (LMO).</p> <p>At HubSpot, we’ve always given marketers the tools they need to connect with their customers and drive growth. We did it with our Website Grader that helped millions of marketers optimise for search engines. And we’re doing it today with the launch of AI Search Grader, the first free tool that helps brands understand how they show up in Large Language Models (LLM) and AI search.</p> <p>“What marketers have been doing for years to attract customers won’t be as effective in the future. People just aren’t using traditional search like they used to,”<strong>&nbsp;said Nicholas Holland, VP of Product and GM of Marketing Hub at <a href="https://www.hubspot.com/" target="_blank" rel="noopener">HubSpot</a>.&nbsp;</strong>“As the tides are turning toward AI search, we want to help marketers stand out and get discovered in this new environment. That’s exactly what AI Search Grader will give marketers – a new, free way to connect with their customers in the AI era.”&nbsp;</p> <p><strong>AI Search Grader takes marketers from SEO to LMO</strong></p> <p>Today, marketers are manually testing how their brands are showing up in AI search: visiting different experiences, crafting specialised prompts, aggregating responses, and synthesising results – all just to get a fraction of the information they need. And there’s little guidance to help them act on what they find.</p> <p>HubSpot’s AI Search Grader eliminates the need for AI expertise, and takes the guesswork out of how a brand is showing up in AI search. 
It does prompt engineering for marketers and contextualises their brand’s performance to make LMO simple.&nbsp;</p> <p>AI Search Grader includes four key components: overall grade, brand sentiment score, share of voice score, and personalised analysis.</p> <p><a href="https://www.hubspot.com/ai-search-grader" target="_blank" rel="noopener">AI Search Grader</a>&nbsp;is free and globally available in English for any marketer to use.</p></div> INTERVIEW - From Sydney to Silicon Valley: HP’s David McQuarrie's Top Tips to Global Tech Leadership 2024-08-12T21:25:58+10:00 2024-08-12T21:25:58+10:00 https://itwire.com/business-it-news/data/from-sydney-to-silicon-valley-hp%E2%80%99s-david-mcquarrie-s-top-tips-to-global-tech-leadership.html Team Writer stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/2f5a4f0035456afc0380bfc3ca1062c4_S.jpg" alt="INTERVIEW - From Sydney to Silicon Valley: HP’s David McQuarrie&#039;s Top Tips to Global Tech Leadership" /></div><div class="K2FeedIntroText"><p>In an exclusive interview with ITWire, David McQuarrie, HP's Global Chief Commercial Officer, shared invaluable lessons from his extraordinary journey across the globe.</p> </div><div class="K2FeedFullText"> <p>Hailing from Australia, McQuarrie’s career has evolved from regional roles in Europe to influential positions in the global boardrooms of HP in the United States. His story underscores the importance of diverse experiences in shaping and transforming global&nbsp;leadership in the dynamic tech industry.</p> <p><strong>Tip 1: Embrace Creativity and Perseverance</strong><br />Growing up in Sydney, McQuarrie was deeply influenced by the values his homeland embodies.</p> <p>“Having lived and worked in a lot of places in the world, the ability to be imaginative is something Australians are good at,” McQuarrie said. “They’ve had to be resilient and ingenious in solving problems.”</p> <p>These strengths have built a strong foundation for his career and are invaluable to companies in the current tech landscape. McQuarrie’s cultural blend of creativity and perseverance became his crucial strength as he navigated global workplaces across&nbsp;continents.</p> <p><strong>Tip 2: Foster Inclusivity</strong><br />Leading a diverse team at HP, McQuarrie emphasized the importance of inclusivity, a value that sits at the heart of the companys culture. “It feels like a family,” he said, reflecting on the strong connection between leadership and employees through decades of continuous transformation. This culture of inclusivity has strengthened internal trust and bolstered consumer confidence in HP.</p> <p>“It’s one of the things that attracted me to the company,” McQuarrie said. “It’s an organisation people have trusted since the very beginning.”</p> <p><strong>Tip 3: Trust is Crucial</strong><br />Trust is also a foundational cornerstone of HP’s approach to emerging technologies, including its growing AI strategy. 
Despite consumers' concerns about AI, McQuarrie remains optimistic about its potential.</p> <p>"Our role is to ensure that we are providing the best, most secure, and most trusted technology, where people feel protected and empowered," he said.</p> <p>He envisions AI as a transformative tool that will enable personalised working and living, enhance the HP hybrid experience and shape a future of growth and well-being.</p> <p>As HP continues to innovate, McQuarrie’s journey from Australia to a global leadership position offers an inspiring lesson in values, transformation, and trust in technology leadership.</p></div> An AI strategy isn't enough; you need prepared data, and Cloudera says it has the platform to help you succeed 2024-08-07T19:17:41+10:00 2024-08-07T19:17:41+10:00 https://itwire.com/business-it-news/data/an-ai-strategy-isn-t-enough-you-need-prepared-data-and-cloudera-says-it-has-the-platform-to-help-you-succeed.html David M Williams stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/4562626d25f3b5b7ac4601f079641613_S.jpg" alt="An AI strategy isn&#039;t enough; you need prepared data, and Cloudera says it has the platform to help you succeed" /></div><div class="K2FeedIntroText"><p>Hybrid data, analytics, and AI platform Cloudera says AI will add about $US15.7T - that's trillion - to the global economy by 2030. To get there your data needs to be in order, and Cloudera says it has the best platform to help you succeed, and not be left behind.</p> </div><div class="K2FeedFullText"> <p>Cloudera CEO <a href="https://www.linkedin.com/in/charles-sansbury-90b462a/" target="_blank" rel="nofollow noopener">Charles Sansbury</a> and other executives took to the stage at <a href="https://www.cloudera.com/events/evolve/singapore.html" target="_blank" rel="nofollow noopener">Cloudera Evolve 2024 APAC</a>&nbsp;being held in Singapore this week, to announce new product offerings, share customer stories, and talk about the company's growth.</p> <p>A recurring theme of the event was how vast the Cloudera customer base is, and the enormous scale of data being managed by the platform. In fact, Sansbury said, the company brings in more than a billion US dollars in revenue, is used across the global top 2000 companies including eight out of 10 banks, and has over 25 exabytes of data under management.</p> <p>"That's more data than any other company in the world," Sansbury said, pictured below.</p> <p><img src="https://itwire.com/images/authors-images/davidmwilliams/CharlesSansburyCloudian.jpeg" alt="CharlesSansburyCloudian" width="300" height="300" style="display: block; margin-left: auto; margin-right: auto;" /></p> <p>{loadposition david08}</p> <p>An exabyte is 1,000,000,000,000,000,000 bytes. We can relate to a terabyte - that's 1000 gigabytes, and a gigabyte in turn is 1000 megabytes. We have disk drives measured in terabytes now. Well, 1000 terabytes is called a petabyte, and an exabyte is 1000 petabytes. Some people say <a href="https://www.backblaze.com/blog/what-is-an-exabyte/" target="_blank" rel="nofollow noopener">all the words ever spoken by humanity</a> would sum to 5 exabytes. And Cloudera manages more than five times that much.</p>
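<p>For readers who want to sanity-check those units, the arithmetic is simple to reproduce. Decimal prefixes are assumed here, as in the article, rather than binary units:</p> <pre>
# Decimal storage units, as used in the article (each unit is 1000 of the previous).
BYTES_PER_TB = 10 ** 12
BYTES_PER_EB = 10 ** 18

under_management_eb = 25
print(under_management_eb * BYTES_PER_EB, "bytes")                 # 2.5e19 bytes
print(under_management_eb * BYTES_PER_EB // BYTES_PER_TB, "TB")    # 25,000,000 TB
# The "all words ever spoken" estimate versus Cloudera's figure:
print(under_management_eb / 5, "times the 5 EB estimate")
</pre>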
<p>It's so much data that Sansbury says some of Cloudera's customers - a subset of that 25 exabytes - have more data individually than all the data contained by some of the public cloud hyperscalers across all their customers combined.</p> <p>Obviously then, Cloudera is not itself using those hyperscalers as its backbone for data. Well, not completely. What's going on here is that Cloudera is a hybrid platform - in fact, Sansbury says it is the only true hybrid platform - working with customer data wherever it may be. Some data may be on-premises, some on this cloud, some on that cloud. 
In short, the trends Cloudera foresaw have come to pass, and Cloudera finds itself in a strong position to assist organisations of all sizes in their data and AI journey, especially as a first mover in <a href="https://docs.cloudera.com/cdp-public-cloud/cloud/cdp-iceberg/topics/iceberg-in-cdp.html" target="_blank" rel="nofollow noopener">adopting Iceberg as central and native to its platform</a>.</p> <p>It's some distance from Cloudera's roots as sifting through huge stores of corporate information to a major player helping customers prepare their data for AI purposes, be they tightly regulated financial industries, or industries with significant security, privacy, and sovereignty requirements, or even simply any company that wants to be more efficient with its spending while still achieving its goals.</p> <p>In fact, one Cloudera customer who most definitely has security front-of-mind is the <a href="https://www.afp.gov.au/" target="_blank" rel="nofollow noopener">Australian Federal Police</a>, who were named Cloudera's ANZ Customer of the Year during the event.</p> <p>The drive to infuse, embed, and leverage AI is now so great that it is projected to add a huge <a href="https://www.pwc.com/gx/en/issues/data-and-analytics/publications/artificial-intelligence-study.html" target="_blank" rel="nofollow noopener">15.7 trillion US dollars to the global economy</a> by as soon as 2030.</p> <p>Yet, wanting to do AI is different from doing AI. And having data is most definitely different from having prepared and organised and governed data that's ready to use.</p> <p>Sansbury has previously likened the AI journey to a field; you want to plant crops but boy, there's a lot of work that goes before it. You must plow the field into all kinds of nice rows. But there's dirt to shift, there are rocks to remove, you may need to boost soil and nutrients. Finally, your field is prepared. Only then can you plant crops and expect great results.</p> <p>So too with data; Cloudera conducted research that formed the basis for its <a href="https://www.cloudera.com/campaign/the-state-of-enterprise-ai-and-modern-data-architecture.html" target="_blank" rel="nofollow noopener">state of enterprise AI and modern data architecture</a> report, revealing the top barriers to AI adoption presently are security and compliance concerns, lack of training and skills to manage AI, and the sheer cost of AI adoption itself.</p> <p><img src="https://itwire.com/images/authors-images/davidmwilliams/RemusLimCloudera.jpeg" alt="RemusLimCloudera" width="300" height="300" style="display: block; margin-left: auto; margin-right: auto;" /></p> <p>Even so, ignore AI at your peril. “The gap between businesses leveraging AI and those lagging behind is widening rapidly,” said Cloudera SVP APJ <a href="https://www.linkedin.com/in/remus-lim-48711a4/" target="_blank" rel="nofollow noopener">Remus Lim</a>, pictured above.</p> <p>Here's where Cloudera comes in; the product is no mere database but a platform that, using Sansbury's analogy, helps you plow your field and get your rows lined up ready for planting, by assisting you in ensuring a strong data foundation.</p> <p>“The need for a strong data foundation has never been clearer," Lim said. "AI is the key that unlocks the true potential of this valuable asset, but it is just the first step. In fact, our recent survey showed that 88% of enterprises are adopting AI in some capacity. 
However, organisations are seeking greater support in terms of data infrastructure and skills to operationalize their AI and access all of their data such that critical insights are trustworthy and unbiased; only then will the truly transformative impact of AI be seen.”</p> <p>And, by embracing hybrid multi-cloud and working with your data where it resides, Cloudera can eliminate much of the bulky cost of shifting data to, and storing it in, public clouds.</p> <p>"Cloudera is the only true hybrid platform for data analytics and AI. We enable global enterprises to use data to solve the impossible today," Lim said.</p></div>
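<p>For readers curious what an open table format such as Apache Iceberg looks like in practice, below is a minimal, vendor-neutral sketch using PySpark with the Iceberg runtime available to Spark. It is illustrative only: the catalog name "demo", the warehouse path, and the events table are assumptions made for this example, and it does not depict Cloudera's own tooling.</p>
<pre><code># A minimal, generic Apache Iceberg sketch with PySpark (illustrative only).
# Assumes a matching iceberg-spark-runtime package is on Spark's classpath.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-sketch")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# The table's schema, snapshots and partition layout live in open metadata files,
# so any engine that understands Iceberg can read or write the same table.
spark.sql("CREATE NAMESPACE IF NOT EXISTS demo.db")
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.db.events (
        event_id   BIGINT,
        event_type STRING,
        event_time TIMESTAMP
    ) USING iceberg
    PARTITIONED BY (days(event_time))
""")

spark.sql("INSERT INTO demo.db.events VALUES (1, 'login', current_timestamp())")
spark.sql("SELECT event_type, COUNT(*) AS n FROM demo.db.events GROUP BY event_type").show()
</code></pre>
<p>Because the table metadata is open, the same table could in principle be queried from other Iceberg-aware engines without copying the data, which is the portability the article credits Cloudera with backing early.</p>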
Cognizant Moment: Powering the next generation of data-led, AI-fueled experiences at enterprise scale 2024-08-01T11:04:12+10:00 2024-08-01T11:04:12+10:00 https://itwire.com/business-it-news/data/cognizant-moment-powering-the-next-generation-of-data-led,-ai-fueled-experiences-at-enterprise-scale.html Cognizant stan.beer@itwire.com <div class="K2FeedImage"><img src="https://itwire.com/media/k2/items/cache/a6c0955fcd0c5de23a39bc7f9be0b691_S.jpg" alt="Ben Wiener, global head of Cognizant Moment" /></div><div class="K2FeedIntroText"><p><em>New digital experience practice will harness AI for enhanced customer engagement and business innovation</em></p> <p>COMPANY NEWS: Cognizant announced the launch of Cognizant Moment™, the next evolution of the company's digital experience practice area, designed to help clients leverage the power of artificial intelligence (AI) to reimagine customer experience and engineer innovative strategies aimed at driving growth.</p> </div><div class="K2FeedFullText"> <p>The new practice builds on the digital experience expertise and solutions Cognizant has delivered for clients over the last 20+ years, and advanced through a series of key acquisitions in the digital experience space. Now, as the ways consumers interact with technology are shifting to include multi-modal experiences, Cognizant aims to give clients the tools and insights they need to drive differentiation, cultivate customer loyalty and become future-ready.</p> <p>Cognizant Moment™ is built on the foundation of intelligent ecosystem orchestration, a strategy that connects experiences as well as the underlying data, technology and operations across the entire enterprise ecosystem. This approach enables clients to leverage generative AI's content generation capabilities, along with human ingenuity, to innovate, differentiate and drive growth by informing and automating processes, and creating dynamic, hyper-personalised experiences for their customers.</p> <p>"Every experience comes down to a series of moments, seamlessly enabled by intuitive strategy, human-centred design and curated technology," said <em><strong>Ravi Kumar S, CEO</strong></em>, <strong>Cognizant.</strong> "Cognizant research shows that a majority of G2000 business decision makers believe that generative AI will help them create new products and services, and many of them are already using the technology to design or deliver them.
We aim to meet the moment, as enterprises move to differentiate through experience, and enable them to implement a data and technology-driven approach to shape the customer journey, rather than relying on traditional marketing agencies."</p> <p>Cognizant Moment aims to address these challenges directly, with a suite of capabilities across multiple workstreams, including:</p> <ul> <li>Intelligent Ecosystem Orchestration: Connecting data, technology and operations to create a dynamic experience ecosystem;</li> <li>Business Transformation: Innovating and reimagining operations to assist clients in generating new growth and economic value;</li> <li>Digital Products &amp; Platforms: Using human-centric agile methods to deliver products that resonate with customers and employees;</li> <li>Marketing &amp; Content: Transitioning from manual management to AI-driven personalised marketing, supported by comprehensive content services;</li> <li>Commerce: Moving beyond transactions to create immersive commerce experiences;</li> <li>Learning: Crafting interactive experiences to enable behaviour change and skill-based performance.</li> </ul> <p>"Cognizant Moment is one of the few modern experience practices that can deliver at scale, globally, and with the cutting-edge technical expertise required to effect the needed change," said <em><strong>Ben Wiener, global head of Cognizant Moment.</strong></em> "As experience leaders grapple with a confluence of technological and behavioural forces, we aim to help clients change the game and equip them to grow and thrive competitively, backed by Cognizant's long heritage of digital experience domain knowledge, technology and industry expertise."</p> <p>"The effectiveness of technology implementations must be measured by the experiences of both customers and employees," said <em><strong>Phil Fersht, CEO and Chief Analyst, HFS Research.</strong></em> "Cognizant Moment brings the firm's best assets and talent together to maximise AI impact and people value."</p> <p>According to <em>New Work, New World</em>, a research report published by Cognizant analysing data from Oxford Economics, generative AI could inject an additional 3.5% in productivity growth into the US economy by 2032, estimated at $1 trillion per year. Additionally, by the same year, up to 90 percent of jobs could be disrupted in some way by generative AI, from administrative assistants to CXOs. By helping clients drive productivity gains with generative AI and changing the way marketers connect with customers, Cognizant Moment aims to play a central role in shaping the future of customer experience.</p> <p><em><strong>To learn more about Cognizant Moment, visit <b><a href="https://c212.net/c/link/?t=0&amp;l=en&amp;o=4220832-1&amp;h=1537274141&amp;u=https%3A%2F%2Fwww.cognizant.com%2Fmoment&amp;a=this+page" target="_blank" rel="noopener">this page</a></b>.</strong></em></p> <p><strong>About Cognizant</strong><br />Cognizant (Nasdaq: CTSH) engineers modern businesses. We help our clients modernize technology, reimagine processes and transform experiences so they can stay ahead in our fast-changing world.
Together, we're improving everyday life. See how at<strong><em> <a href="https://c212.net/c/link/?t=0&amp;l=en&amp;o=3972998-1&amp;h=3008181381&amp;u=https%3A%2F%2Fwww.cognizant.com%2F&amp;a=www.cognizant.com" target="_blank" rel="noopener">www.cognizant.com</a></em></strong>&nbsp;or @cognizant.</p></div>