iTWire - Data Analytics & Artificial Intelligence - https://itwire.com - Thu, 12 Sep 2024 18:24:19 +1000

The modern world of data and AI needs a new approach to data warehousing explains Teradata CEO
https://itwire.com/business-it-news/data/the-modern-world-of-data-and-ai-needs-a-new-approach-to-data-warehousing-explains-teradata-ceo.html

Although Teradata has almost a half-century of on-prem data warehousing experience, it's a cloud-first business today and is bringing its considerable experience to help its customers modernise. Teradata CEO Steve McMillan spoke to iTWire about growth, AI, and the changing nature of data.

Teradata has been in the business of data warehousing for more than four decades, since before the term "data warehouse" even existed. Today Teradata is still a massive data company, but when CEO Steve McMillan took the helm in June 2020 he declared it would be a cloud-first business. That journey has paid dividends, with many hundreds of customers now part of its cloud business.

Teradata sees half a billion dollars in annual recurring revenue (ARR) through its cloud arm, and is growing at over 30% year on year.

Major organisations are dependent on Teradata, and the cloud-first strategy "has given us the opportunity to demonstrate our Teradata platform can run mission critical workloads successfully in the cloud," McMillan said. "The success has built up a head of steam, and enabled us to demonstrate our maturity in this space."

Australia is a significant part of the Teradata story. "I'm really proud of the Australian team," he said. "The majority of our customers in Australia are on a modernisation journey with Teradata."

Now, you don't need to read too far in the news these days to find companies talking about cloud and modernisation. However, when it comes to Teradata these words are significant. Teradata customers are huge; the business has been known for its on-premises strength since 1979. Finance, retail, telco, government, and other major industries live on Teradata with vast mountains of data. Helping these organisations shift into hybrid operations with a cloud component is no mean feat, and is not without challenges of inertia. These are not overnight projects; they involve complex rules around governance, provenance, sovereignty, and more.

Yet, Teradata has made it happen. "Over 70% of our customers are operating in a hybrid environment today," McMillan said. "Still a lot is on-premises, but data sets are moving or have moved to the cloud."

What drives this is the desire to get the best out of that data. "Enterprise data warehousing has evolved over the last 30 to 40 years of taking data from all kinds of silos and putting it into one single store," he said. "But data has gravity."

"We have a capability called QueryGrid that enables customers to move a query to the data as opposed to moving all the data to the query. With QueryGrid we can have a Teradata ecosystem in Azure, in AWS, on-prem, and the QueryGrid enables you to send the query to the appropriate Teradata ecosystem sending back results without moving lots of data."

This is the way of modern companies. "We've found customers have a whole range of workloads, some of which require highly advanced dedicated storage mechanisms with Teradata on top to run super-complex workloads."

In fact, today's customers most likely "have 10x the data in native object stores than in structured data warehouses."

A major factor pushing companies to consider their data strategies is the real power of artificial intelligence, as seen in machine learning and generative AI, both made viable by the sheer power of today's compute engines.

"A Gartner study said over 90% of ANZ CIOs would have AI implementations in 2026," McMillan said. He's seeing huge growth in the Teradata platform being used from that AI perspective.

In fact, back in August 2022 - "just before the ChatGPT craziness in November 2022" - Teradata launched new AI and ML capabilities named ClearScape. "We've been helping customers take advantage of AI, and now GenAI, through the Teradata platform. We've seen super-interesting answers driving good use cases from Teradata customers," he said.

And over the last 12 to 18 months McMillan has definitely observed a maturing in AI projects. He's observed three important characteristics that measure if a project will be successful or not. This is valuable information from a global CEO, gleaned from the experiences of major organisations. These are:

  1. The project must ensure trust. "Make sure it's training with trusted data inside your ecosystem to have more assurance over the outputs," McMillan said. Even if you don't want to have generative AI talk directly to your customers, you can still have it augment data for your staff. "We're seeing human-centric GenAI solutions that empower people inside the organisation, giving advanced capabilities."
  2. The generative AI models must be ethical. "Trusted and ethical are super important," McMillan said. You must make sure no bias is introduced. "Some AIs are put in place with implicit bias in them. For example, against certain members of the population, either by economic stature or racial stature, or other. We make sure we can help customers with those ethical solutions. For example, you can have a capability to run A/B comparisons for advanced models."
  3. The project must be sustainable. It has to have some green credentials; there's no point having a super advanced AI that costs so much to run it's taking your business backwards. Here Teradata can bring all its expertise to bear. "We help implement AI that can be cost effective, and cost efficient, leveraging Teradata technology to optimise language models," McMillan said.

These are the three factors that successful AI projects have in common. However, you might still be wondering what would be a good use case for AI in your company, in the first place. You know AI is the key to getting ahead, to unlocking the hidden value of your data to speed up customer service, to accelerate time to market, to create content quickly, and more. But what does this mean in a practical sense, in your specific situation? Again, McMillan has three observations from his experience. "These are the three horizons in terms of customers utilising AI," he said.

  1. Improve the efficiency and effectiveness of people inside your company. This might be achieved by using commercially available AI features in tools you already use, such as the various Copilots being released.
  2. Embed AI, such as generative AI, into your product or services. In this way you help your customers gain efficiency and effectiveness.
  3. Identify a way to use AI to transform your industry. An example McMillan provides is that of Unilever; the chief data officer for Unilever told him how merchandisers would speak with buyers at retailers like Walmart to negotiate over the price of a bar of soap, for example. Now Unilever has developed advanced models of demand and supply to help augment the merchandiser with information, while Walmart has done the same for its buyers. "It's like how high-frequency trading transformed the stock exchange. It moves to a battle of models. We find that really interesting," he said.

McMillan adds, "our approach in Teradata is not to lock a company into one kind of model. We believe in the future companies won't just use large language models with trillions of parameters, but will use small and medium language models with much smaller parameters but far more specialised in terms of results that come out."

"This will ensure better data quality, minimised hallucinations, bias removed, and better output sets."

stan.beer@itwire.com (David M Williams) Data Mon, 09 Sep 2024 22:45:09 +1000
Queensland Government taps Databricks for new Data and AI Academy
https://itwire.com/business-it-news/data/queensland-government-taps-databricks-for-new-data-and-ai-academy.html
Adam Beavis, Vice President and General Manager for Databricks in ANZ.

Data and AI company, Databricks has launched a Data and AI Academy for Queensland State Government agencies, designed to expedite data and AI training and enhance the capabilities of Australian public sector staff.

Databricks says that upon completing the courses, participants will be able to obtain the necessary Databricks accreditations and certifications.

“Databricks Data and AI Academy will upskill more than 100 Queensland Government staff on data, analytics, and AI, helping them to become more proficient with innovative technology to improve productivity and achieve better results,” notes Databricks.

“The strategic programme will also allow them to then train their colleagues, empowering public sector staff to better leverage the suite of data and AI capabilities at their disposal through the Databricks Platform.”

“Our organisation’s digital transformation is well underway, yet a recent review of our data and AI capabilities revealed a skills gap when it came to deploying certain functionalities. Databricks' leading Data and AI Academy offered us the opportunity to empower our staff and ensure skills uplift within our workforce, owing to the programme's inherent knowledge sharing feature,” said Peter How, General Manager, Innovation & Delivery, National Injury Insurance Scheme, Queensland.

“The learnings from the program will inform and fuel our organisation's capabilities for years to come, enabling our teams to co-design and provide improved services for our participants, external stakeholders and internal staff”.

How notes that the Data and AI Academy also promotes citizen data capabilities by automating complex preprocessing, engineering, and model training processes, enabling users to easily build, train, and deploy their own models through a low-code approach.

“Key to our department’s data strategy is the data and digital literacy of our staff, in a rapidly changing environment. The Data and AI Academy is just what we have been looking for as we face skills gaps across the department in leveraging data and AI processing capabilities and mounting implementations,” said Damon Atzeni, Director of Data and Analytics, Queensland Health.

”The Academy is our fast track to understanding the power of the solution we have and developing the skills to use it effectively. The program has enabled us to self-manage deployments in reforming our health data, resulting in greater administrative efficiency and patient care insights.”

“Between Australia’s tight talent market and the quick rise and rapid evolution of AI capabilities, many organisations understand they must double down on training efforts to run best-in-class operations,” said Adam Beavis, Vice President and General Manager for Databricks in ANZ.

“Our leading data and AI academy programme uniquely positions us to assist and guide institutions like the NIISQ and Queensland Health in upskilling staff to accelerate the development of different use cases and their implementation to further innovation.”

The adoption of the Databricks Data and AI Academy by the National Injury Insurance Scheme Queensland and Queensland Health follows a Databricks audit of the two state government departments' data strategies, which identified the need for refined implementation strategies.

Databricks hopes to expand the programme to additional departments throughout the Queensland Government.

About Databricks
Databricks is the Data and AI company. More than 10,000 organisations worldwide — including Block, Comcast, Condé Nast, Rivian, Shell and over 60% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to take control of their data and put it to work with AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on LinkedIn, X and Facebook.

stan.beer@itwire.com (Gordon Peters) Data Wed, 28 Aug 2024 11:11:49 +1000
Cloudian, Lenovo unveil AI data lake platform
https://itwire.com/business-it-news/data/cloudian,-lenovo-unveil-ai-data-lake-platform.html
Cloudian managing director Asia Pacific and Japan James Wright

Software company Cloudian partnered with Lenovo to unveil the Cloudian HyperStore AI data lake platform.

Built on Lenovo ThinkSystem SR635 V3 all-flash servers with AMD EPYC 9454P processors, the new solution demonstrated performance of 28.7 GB/s reads and 18.4 GB/s writes from a cluster of six power-efficient, single-processor servers, delivering a 74% power efficiency improvement versus an HDD-based system in Cloudian testing.

Lenovo combines Cloudian’s AI-ready data platform software with its all-flash Lenovo ThinkSystem SR635 V3 servers and 4th Gen AMD EPYC processors to deliver a data management solution for AI and data analytics.

“There’s a big focus on the AI boom in Australia, New Zealand and across APAC, and it’s easy to see why when bodies like the CSIRO say the Australian market alone could be worth close to $500 billion in the next few years,” says Cloudian managing director Asia Pacific and Japan James Wright (pictured).

“But there’s a storage and infrastructure layer that companies and government agencies need to power the data-hungry workloads central to AI’s performance and functionality. What’s out there now simply won’t cut it. Imagine trying to power the mobile applications we use today with the simple mobile phones we had 20 years ago – it wouldn’t work and it’s no different at the infrastructure level, particularly with AI in play.”

“For organisations looking to innovate or drive research and discovery with AI, ML, and HPC, this solution promises to be transformative,” says Cloudian CEO and co-founder Michael Tso.

Built for mission-critical, capacity-intensive workloads, the platform features exabyte scalability, S3 API compatibility, military-grade security, and Object Lock for ransomware protection.

“Combining Lenovo’s high-performance all-flash AMD EPYC CPU-based servers with Cloudian's AI data lake software creates a solution that can handle the most demanding AI and analytics workloads,” says Lenovo general manager Stuart McRae.

“This partnership enables us to offer our customers a cutting-edge, scalable, and secure platform that will help them accelerate their AI initiatives and drive innovation.”

“AI workloads demand a lot from storage. Our 4th Gen AMD EPYC processors together with Lenovo's ThinkSystem servers and Cloudian's AI data lake software deliver the performance and scalability that AI users need,” says AMD corporate vice president strategic business development Kumaran Siva. “The single socket, AMD EPYC CPU-based Lenovo ThinkSystem SR635 V3 platform provides outstanding throughput combined with excellent power and rack efficiency to accelerate AI innovation.”

stan.beer@itwire.com (Kenn Anthony Mendoza) Data Fri, 23 Aug 2024 08:24:55 +1000
Want GenAI? Then you need a document database for the best result says MongoDB exec
https://itwire.com/business-it-news/data/want-genai-then-you-need-a-document-database-for-the-best-result-says-mongodb-exec.html

Generative AI is bringing vast business benefits from summarising documents, to helping with customer service, even aiding organisations in asking questions of complex systems in simple plain language. However, you might not be using the right tools for the job. MongoDB field CTO Rick Houlihan experimented and found vast performance gains when using a document database over a relational one.

In the world of databases, MongoDB is a leader in the 'no SQL' movement. Relational databases trace their roots to mathematical set theory, and the rules of relational algebra were formalised by E.F. Codd and other IBM researchers from 1970 onwards. Relational databases power sales, payrolls, inventory, flight schedules, and all kinds of enterprise purposes around the world.

Yet, in a modern world where text - and specifically, natural language - is becoming a major force, the relational database may simply not be the right choice.

"Third normal form is a great mathematical and logical representation of data," said MongoDB field CTO Rick Houlihan, referring to the relational model of using tables with linked fields to ensure a single set of master data without redundancy. "But it has a high time complexity to map data together."

"MongoDB makes it easier to work with data. Our core database, the document database, was built to remove abstractions from data. A document database brings data out in a better way," he said. "We don't have to work with data in third normal form in apps."

In fact, the challenge of developers getting their heads around relational database mechanisms can often be a complex one. It's why, Houlihan notes, there's a whole slew of object-relational mapping (ORM) tools. One such example is the popular Entity Framework for .NET; such tools are used by developers to remove abstractions from relational databases. Houlihan says you simply don't have that fuss in a document database; it just works differently.

"We say just store the data how you use it. Make document structures that map to your access patterns," he said. "It's more efficient."

And, when it comes to generative AI - which institutions all around the world are working hard to find and pilot use cases for - the choice of database can make a huge difference.

Houlihan is more than willing to put his money where his mouth is. "I've always been a big fan of Grace Hopper, who said 'one accurate measurement is worth a thousand expert opinions,'" he said. Thus, earlier this year he tested for himself how well different databases could support generative AI, with truly eye-opening results.

Using the exact same hardware, and with Postgres and MongoDB set up with clearly stated configurations and parameters, Houlihan loaded single attributes and multiple attributes of increasing size into the databases. This replicates the type of data generative AI deals with; it's not about simple numeric order IDs or product SKUs or surnames and first names. Rather, GenAI is all about huge chunks of text - contracts, manuals, documentation. Even if the text is chunked it's still in blocks of 4KB or more. It's a scenario that a document database excels at, and a relational database does not.

Houlihan's testing showed that for small block sizes MongoDB and Postgres compared relatively evenly, until the payload size ramped up. Whether using Postgres JSON (a widely used data interchange format popular across many applications and technology stacks) or JSONB, after a mere 200 bytes the processing time to insert data began increasing significantly. Meanwhile, MongoDB retained a reasonably linear insert time irrespective of the size of the data.

For example, to insert 200 attributes at 4000 bytes Postgres took 37.2s using JSONB and 17.5s using JSON, while MongoDB did the same work in fractionally over a second.

The read workload running against the same data took 53.8s and 27.8s in Postgres for JSONB and JSON respectively, against 8.4s in MongoDB.
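
To give a rough sense of how such a comparison can be run, here is a minimal timing sketch using the standard pymongo and psycopg2 drivers. It is not Houlihan's published benchmark; the connection strings, database names, table layout and payload sizes are assumptions for illustration, but it follows the same shape: insert increasingly large JSON documents into both stores and time the inserts.

```python
# Rough timing sketch in the spirit of the comparison described above.
# Connection strings, sizes and table layout are assumed for illustration.
import time

import psycopg2
from psycopg2.extras import Json
from pymongo import MongoClient

PAYLOAD_SIZES = [200, 1000, 4000]   # bytes per attribute, loosely echoing the article
NUM_ATTRS = 200                     # attributes per document
NUM_DOCS = 50                       # documents per run (kept small for a sketch)

def make_doc(size: int) -> dict:
    return {f"attr_{i}": "x" * size for i in range(NUM_ATTRS)}

mongo_coll = MongoClient("mongodb://localhost:27017").benchdb.bench   # assumed local instance
mongo_coll.drop()

pg = psycopg2.connect("dbname=benchdb")                               # assumed local database
cur = pg.cursor()
cur.execute("DROP TABLE IF EXISTS bench; CREATE TABLE bench (doc jsonb)")
pg.commit()

for size in PAYLOAD_SIZES:
    docs = [make_doc(size) for _ in range(NUM_DOCS)]

    start = time.perf_counter()
    mongo_coll.insert_many([dict(d) for d in docs])   # copies: insert_many adds _id
    mongo_secs = time.perf_counter() - start

    start = time.perf_counter()
    cur.executemany("INSERT INTO bench (doc) VALUES (%s)", [(Json(d),) for d in docs])
    pg.commit()
    pg_secs = time.perf_counter() - start

    print(f"{size:>5} bytes/attr: mongodb={mongo_secs:.2f}s postgres_jsonb={pg_secs:.2f}s")
```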

Data types like JSON or VARIANT can help to shoe-horn large objects into relational databases, but the takeaway from Houlihan's experiments is clear. Relational databases suffer from performance limitations of wide rows and large data attributes, while a document database such as MongoDB takes them in its stride.

Relational databases are valuable, and important, of course, but Houlihan's message is you need to use the right tool for the right task. And, when it comes to generative AI, the right tool is a document database like MongoDB.

Houlihan published his work in a GitHub repo for the world to see and for anyone to validate for themselves.

It's not simply an experiment; when it comes to MongoDB and GenAI "there are companies doing real things, with real impact," Houlihan said. He cites Pathfinder as one example, an organisation which uncovers cybercrime evidence, collates it, and then uses AI to find similarities and identify perpetrators of such evil as human trafficking and exploitation.

Another is Novo Nordisk, which is leveraging MongoDB and generative AI to improve health care and advance medical treatment for common diseases like diabetes and cancer. Reducing the time required to compile clinical research reports for regulatory approval of new pharmaceuticals from 12 weeks to just 10 minutes has empowered its business to do more with less.

Meanwhile, MongoDB has a big Australian connection, Houlihan explained. While MongoDB is available as a free product, it also comes as a managed service called Atlas. A new feature, Atlas Charts, uses natural language to easily visualise data, helping end users self-serve meaningful, actionable insights from their data without having to wait for specialised BI developers to be freed up.

Atlas Charts is the work of MongoDB's Sydney-based engineering team. This team has also been a big part of the company's Relational Migrator, a service that uses generative AI to help organisations migrate relational databases to MongoDB, among many other projects.

Houlihan previously spent time as the first technical product manager for AWS DocumentDB, building a NoSQL centre of excellence there. Now he is at MongoDB and taking his big ideas further.

What attracted him to MongoDB was how the product "wraps functionality behind a unified API where devs don't have to learn five or six different tech stacks. We don't reinvent the wheel; we invest in the core service and then add the best-of-breed from the industry into the product." An example is Lucene; it's the most popular full-text search engine, and the same one that backs Elastic, among other popular products. "So, we built Lucene into our own API to reduce developer overhead in working with the data."

"On top of that," he said, "our founders had geographical data distribution in mind from the beginning. The relational database is not built with geographic distribution in mind. Layers can be added on top like log shipping for Postgres, or Golden Gate for Oracle, but it's often left for the developer to solve."

By contrast, "it's a first-class citizen in MongoDB. We combine the flexibility of the document model with the ability to determine on each individual write how the data should be replicated and what level of consistency is required. It's a really novel way of working with data and how to store and access it."

"We work with the largest financial institutions in the world. These companies run extremely high-velocity trading and payment processing applications; the tech we provide for those kinds of workloads drives an enormous amount of efficiency," Houlihan said.

stan.beer@itwire.com (David M Williams) Data Wed, 21 Aug 2024 23:42:18 +1000
How AI is Transforming the Banking Industry
https://itwire.com/business-it-news/data/how-ai-is-transforming-the-banking-industry.html

Artificial intelligence is changing the banking business, with major implications for both traditional and new banks. The industry is shifting from traditional, data-driven AI to sophisticated generative AI, which offers unprecedented levels of efficiency and customer engagement. In the banking industry, generative AI has the potential to increase productivity by 5% and decrease worldwide spending by $300 billion, according to McKinsey's 2023 banking report.

However, that only represents a portion of the overall picture. The integration of AI in banking apps and services has made the industry more customer-centric and digitally relevant. Because AI-based systems are more productive and can draw judgments from volumes of data far beyond human comprehension, banks are now able to reduce costs.

Furthermore, sophisticated algorithms are able to identify misleading data in a couple of seconds. This article will examine how AI is bringing changes that can help revolutionize the banking industry. 

How Banking Has Evolved With Artificial Intelligence

AI's role in banking started with data analysis and task automation, but it has since grown to include complex applications in risk management, fraud prevention, and customized customer care. The emergence of generative AI development, which can create and anticipate using vast amounts of data, is a significant breakthrough that could further alter banking operations and strategy. 

These are not the only uses of artificial intelligence in banking. Risk management, process organization, and security have always been top priorities for traditional banks; yet, until recently, customer satisfaction and engagement have been lacking.

Nearly 80% of banks, according to a Business Insider report, are aware of the possible advantages of artificial intelligence in banking. According to a different McKinsey analysis, artificial intelligence in banking and finance might unlock up to $1 trillion in potential value.

Applications Of AI In Banking

Our world has increasingly relied on artificial intelligence, and banks have already begun incorporating this technology into their offerings. The following are some significant uses of AI in the banking sector:

Fraud Detection and Cybersecurity

Artificial Intelligence in the banking sector is increasingly being used to detect and manage cyber threats in the banking industry. By utilizing machine learning and AI, banks can detect fraudulent activity, monitor system vulnerabilities, reduce risks, and enhance overall security. 

For instance, Denmark's largest bank, Danske Bank, has successfully detected fraud 50% more often and identified false positives 60% less frequently. AI also helps banks respond to potential intrusions before they impact staff, clients, or internal systems. This technology is crucial for enhancing the overall security of online banking.
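
As a concrete illustration of the kind of machine learning involved (not Danske Bank's actual system), an unsupervised anomaly detector can flag unusual transactions for review. The features, distributions and contamination rate below are assumptions made purely for the sketch.

```python
# Minimal fraud-flagging sketch: an Isolation Forest scores transactions by how
# anomalous they look relative to the bulk of historical activity.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Toy transaction features: [amount, hour_of_day, merchant_risk_score] (invented)
normal = np.column_stack([
    rng.gamma(2.0, 40.0, 5000),    # typical amounts
    rng.integers(7, 22, 5000),     # daytime activity
    rng.uniform(0.0, 0.3, 5000),   # low-risk merchants
])
suspicious = np.array([[9000.0, 3, 0.9], [7500.0, 2, 0.8]])  # large, 3am, risky merchant

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

scores = model.predict(np.vstack([normal[:3], suspicious]))  # 1 = normal, -1 = anomaly
print(scores)  # the last two entries should come back as -1, i.e. flagged for review
```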

Loans and Credit

Banks are using AI-based systems to make more informed, safer, and profitable loan and credit decisions. Currently, these systems rely on credit history, scores, and customer references to determine creditworthiness. 

However, these systems are often flawed due to errors and the misclassification of borrowers. AI-based systems can analyze customer behaviour and patterns, determine creditworthiness, and send warnings to banks about behaviours that may increase the chance of default. These technologies are significantly changing consumer lending.
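
A minimal sketch of behaviour-based credit scoring might look like the following; the feature names and training data are invented for the example, and real systems use far richer data and governance.

```python
# Illustrative credit-risk sketch: a logistic regression estimates the
# probability of default from simple behavioural features (all invented).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Features: [monthly_income, card_utilisation_ratio, missed_payments_last_year]
X_train = np.array([
    [6500, 0.20, 0],
    [3200, 0.85, 3],
    [5400, 0.40, 1],
    [2800, 0.95, 4],
    [7200, 0.10, 0],
    [3900, 0.70, 2],
])
y_train = np.array([0, 1, 0, 1, 0, 1])   # 1 = defaulted

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_train, y_train)

applicant = np.array([[4100, 0.78, 2]])
prob_default = model.predict_proba(applicant)[0, 1]
print(f"estimated default probability: {prob_default:.2f}")  # feeds a lending decision or warning
```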

Market Trends

AI-ML in financial services helps banks process large volumes of data and predict the latest market trends. Advanced machine learning techniques help evaluate market sentiments and suggest investment options.

AI solutions for banking also suggest the best time to invest in stocks and warn when there is a potential risk. Due to its high data processing capacity, this emerging technology also helps speed up decision-making and makes trading convenient for banks and their clients.

Chatbots

One of the most practical applications of artificial intelligence in banking is chatbots, which are available around the clock and can recognize consumer behaviour patterns. Chatbots can be integrated into banking apps to provide individualized support, lessen workloads, and recommend appropriate financial services.

In 2019, Erica, a virtual assistant for Bank of America, efficiently managed more than 50 million client requests, including credit card debt reduction and card security upgrades.

Data Collection and Analysis

Every day, banking and financial organizations log millions of transactions. Employees find it daunting to collect and register the massive amount of information generated; organizing and recording such a large volume of data manually is all but impossible.

In these situations, creative AI development companies help collect and analyze data effectively. The data can also be utilized to determine creditworthiness or identify fraud.

Risk Management

The banking and financial industries are significantly impacted by external global events such as political upheaval, natural disasters, and currency changes. Using AI services in banking allows you to stay organized and make timely decisions by providing insights that paint a fairly clear picture of what's ahead.

By calculating the likelihood that a client won't repay a loan, AI for banking also assists in identifying riskier applications. It uses smartphone data and historical behavioural patterns to forecast future behaviour.

Customer Experience

Customers are increasingly looking for better experiences and convenience; ATMs, for example, became popular because people could deposit and withdraw money outside working hours. Continuing that trend, artificial intelligence has been incorporated into banking and finance services, improving customer satisfaction and cutting down on the time needed to record Know Your Customer (KYC) data.

Additionally, AI technology automates credit qualifying and personal loan approval procedures, speeding up approval times and guaranteeing a positive client experience. Furthermore, AI in banking customer support reliably gathers user data for account setup.

Regulatory Compliance

Governments enforce strict regulations on the banking industry, ensuring banks don't commit financial crimes and maintain appropriate risk profiles to prevent significant defaults. Internal compliance teams are common in banks, but manual procedures are expensive and time-consuming. 

Compliance regulations change constantly, requiring banks to adjust their processes. Deep learning, natural language processing (NLP), artificial intelligence (AI), and machine learning (ML) can enhance decision-making by interpreting new requirements. Although AI cannot replace compliance analysts, it can speed up and improve the effectiveness of banking operations.

Predictive Analytics

Predictive analytics with broad application and general-purpose semantic and natural language applications are two of the most prevalent use cases of AI in the banking sector. AI can find patterns and connections in the data that were previously undetectable by conventional technologies.

These patterns might point to unexplored prospects for cross-selling or sales or even point to operational data indicators, which would directly affect revenue.

Process Automation

Robotic process automation (RPA) algorithms automate repetitive, time-consuming operations, increasing operational efficiency and accuracy while lowering costs. Additionally, it frees up users to concentrate on more intricate procedures involving human intervention.

Banking organizations are currently using RPA to improve efficiency and transaction speed. For instance, JPMorgan Chase's CoiN technology analyzes documents far more quickly than a human could and extracts data from them.

Challenges and Setbacks In Adopting AI in Banking

There are challenges in widely implementing cutting-edge technologies like artificial intelligence. Banks that use AI technologies face a number of difficulties, including security concerns and a shortage of reliable and high-quality data. Let's look at them now.

Data Security

The banking sector collects enormous amounts of data, so strong security measures are required to prevent breaches. To ensure that your client data is managed properly, it is crucial to choose the correct technology partner who is knowledgeable in both AI and banking and can provide a range of security choices.

Lack of Quality Data

Before implementing a comprehensive AI-based banking system, banks require high-quality, organized data for training and validation to guarantee that the method is applicable to real-world scenarios.

Furthermore, non-machine-readable data could cause unexpected behaviour from AI models. Banks must, therefore, update their data rules to reduce any privacy and regulatory issues as they deploy AI services.

Lack of Explainability

While AI-based decision support systems are useful, they may also display biases resulting from previous problems with human judgment. To avoid these kinds of problems, banks should guarantee that all AI-generated choices and recommendations can be adequately explained, validated, and shown to follow the model's decision-making process.

Conclusion

Artificial Intelligence in banking is causing a major shift in how banks function, provide services, and interact with clients. Through process optimization, improved client experiences, risk mitigation, and data-driven decision-making, artificial intelligence (AI) is helping banks prepare themselves for success in the digital era.

Banks will obtain a competitive edge, increase operational effectiveness, and provide new, tailored services that cater to their clients' changing demands if they use AI and incorporate it into their strategy.

Banks must, however, approach the deployment of AI with a responsible and strategic mentality, addressing issues with data protection, moral AI development services, and employment effects. Banks can fully utilize the revolutionary potential of artificial intelligence and influence the direction of the financial services sector by finding the ideal balance between innovation and responsible AI adoption.

stan.beer@itwire.com (Guest Writer) Data Wed, 21 Aug 2024 11:50:52 +1000
5 ways to encourage more use of data in your company
https://itwire.com/business-it-news/data/5-ways-to-encourage-more-use-of-data-in-your-company.html

GUEST OPINION: Data generation has accelerated in recent years with the rise of faster internet connectivity and generative AI.

Despite the rapid generation, gathering, and accumulation of data, it appears most enterprises are not making good use of the information available to them. One study shows that around 68% of enterprise data remains unutilized.

Organizations store massive amounts of information, but they are not leveraging it to stimulate innovation, improve products and services, and enhance marketing and customer relations. If this is going to change, then everyone in an organization will have a role to play, as detailed in the following approaches.

1. Elimination of information silos

Information silos occur when data or knowledge is kept isolated or made exclusively available to certain teams, departments, or systems.

While silos may be purposely created for security or privacy reasons, in most instances, they just emerge because of poor data handling or outdated cultures. Unintentional information silos must be dismantled to make data available to those who can extract value from it.

Organizations should carefully examine their legacy systems for potential silos. It is also advisable to review departmental information strategies and workplace cultures to address siloing and empower everyone to use data for productive ends.

Widening access to data safely can be challenging, so it helps to use solutions like Informatica, a comprehensive AI-powered data governance and access control solution that helps teams to manage data across multiple clouds and hybrid environments. It helps address data isolation and brings together disparate data systems with its data governing, cataloging, and integration capabilities.

2. Promoting a data-driven workplace culture

Implementing a culture of data use in an organization does not happen overnight. Merely directing employees to “use data” does not, by itself, build a data-driven culture. It requires a systematic approach and consistency.

One of the key elements in successfully promoting a data-driven culture is leadership. Managers, supervisors, and others in leadership positions should demonstrate and encourage data-driven decision-making. It is important to lead by example, by regularly citing data during meetings and requiring reports to reference relevant metrics and qualitative information whenever applicable.

In addition to leadership commitment, it is important to establish a clear framework and process for data use. Relying on employee initiative is likewise not enough. It is advisable to communicate standards when it comes to data formats, relevance, accuracy, and understandability. Employees should be encouraged to learn to transform raw data into structured and contextualized information in line with the established standards.

Moreover, organizations should make the process of building a data-driven workplace culture a collaborative effort. Allowing employees to have a say on the framework or procedures helps to get them more committed to the goal.

It also helps to establish clear objectives, to easily monitor and quantify the progress toward building a data-driven culture. For example, managers can keep track of the use of relevant data in official documents or communications to check if employees are already building the habit of data use. Managers can also monitor employees’ proficiency in using relevant software tools.

3. Using intuitive AI-enhanced BI tools

With the advent of AI-powered business intelligence software, anyone can perform data analysis or conduct business intelligence without formal training. It’s easier than ever to help resolve problems or arrive at evidence-backed business decisions.

Organizations can invest in intuitive business intelligence software augmented by Natural Language Processing (NLP) and Large Language Models (LLMs) that make it significantly easier to use without being too rudimentary in its outputs.

A good example of an intuitive business intelligence solution is Pyramid Analytics, which brings together the advantages of advanced data analytics and generative artificial intelligence (GenAI) to produce generative business intelligence (GenBI). It allows anyone to interact with data without having in-depth knowledge about data querying and analysis. Users can ask Pyramid to generate insights or produce interactive data presentations through text or voice-based communication.

One challenge with using LLM-enhanced business intelligence solutions, however, is the potential for data leaks. As shown by what happened with ChatGPT, it is possible for the AI systems to reveal information used in training them or details they obtain as they interact with users. As such, it is advisable to carefully examine an LLM-aided BI tool. Preferably, it should not grant LLMs direct access to enterprise data.

Pyramid addresses this issue by adding a layer to the underlying tech, allowing its system to send queries to the LLM of the user’s choice without actually sharing any data with the LLM – all of the analysis and visualization itself happens within the safety of the installed software.
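
The pattern described - letting an LLM see only metadata and questions while the analysis itself runs locally - can be sketched as follows. The `ask_llm` stub, the schema and the hard-coded SQL are placeholders for illustration; this is not Pyramid's implementation.

```python
import sqlite3

# Sketch of the "LLM never sees the data" pattern: only the schema and the
# user's question go to the model; the generated query runs locally and the
# raw rows never leave the analytics layer.

SCHEMA = "sales(region TEXT, month TEXT, revenue REAL)"   # metadata only

def ask_llm(prompt: str) -> str:
    # Placeholder for a call to whichever LLM the user has chosen.
    # The returned SQL is hard-coded here so the sketch runs without a model.
    return "SELECT region, SUM(revenue) FROM sales GROUP BY region"

def answer_question(question: str, conn: sqlite3.Connection):
    prompt = f"Schema: {SCHEMA}\nWrite one SQL query answering: {question}"
    sql = ask_llm(prompt)                 # the LLM sees schema + question, no rows
    return conn.execute(sql).fetchall()   # data stays inside the local engine

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [("APAC", "Jul", 120.0), ("EMEA", "Jul", 95.0), ("APAC", "Aug", 140.0)])

print(answer_question("Which region earned the most revenue?", conn))
```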

4. Providing data analysis and management courses

There is no excuse for being unable to provide line-of-business team members with training on how to utilize data for their own strategic purposes. There are many data training modules available online. Organizations can partner with online training platforms like Udemy.

Also, you can conduct your own custom training sessions and develop customized data analysis training for different departments to address specific needs. To encourage participation in the training, organizations can inject gamification elements or offer incentives.

Data analysis courses provide everyone in the organization with the foundation they need to have a workable understanding of using reports and dashboards. These courses also empower employees to contribute ideas in building the organization’s data use frameworks. The knowledge they gain from the courses allows them to examine, for example, if the requirements and processes result in greater efficiency or if they’re simply an unnecessary burden.

Before picking a course or developing custom training modules, organizations need to ascertain that the training they provide is in line with their operational and business goals. The training should result in useful skills for “citizen data analysis” and the maximization of data used to promote innovation and improvements in business outcomes.

5. Implementing data quality measures to build trust in data

It is difficult to promote data use if the data available in an organization is unreliable. That’s why it is crucial to establish data quality standards. All the data maintained in an organization should be accurate, complete, consistent, and relevant. The data stored should be properly labelled and structured. Timestamps should also be correctly generated to make it easy to find the most recently uploaded information.

Organizations need a good data management system to ascertain that they only keep and use accurate and relevant data. Unnecessary duplicates must be removed to maximize storage space. Updated data should be properly labeled, and old or obsolete data must be erased unless they are deemed necessary for historical contextualization.

In some cases, it helps to appoint a data steward to resolve data conflicts or perform enrichment to find missing information or fix incomplete datasets.

Organizations can use tools like Datafold to automate data testing and quickly spot data quality issues. The platform can perform sophisticated reconciliations between versions of the same data set. It also executes continuous integration and deployment (CI/CD) testing, enabling users to maximize data visibility and obtain insights into the impact of code changes.
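
To make the idea of reconciling two versions of the same dataset concrete, here is a tiny, tool-agnostic sketch (it is not Datafold's API): it compares rows by primary key and reports additions, deletions and changed values, the kind of check a CI pipeline could run on every code change. The column names and rows are invented.

```python
# Tiny data-diff sketch: compare two versions of a dataset keyed by id and
# report added, removed and changed rows.

old_rows = {
    1: {"name": "Ada",  "plan": "pro",   "mrr": 49},
    2: {"name": "Berk", "plan": "basic", "mrr": 9},
    3: {"name": "Cho",  "plan": "basic", "mrr": 9},
}
new_rows = {
    1: {"name": "Ada",  "plan": "pro",   "mrr": 59},   # changed
    3: {"name": "Cho",  "plan": "basic", "mrr": 9},    # unchanged
    4: {"name": "Dee",  "plan": "pro",   "mrr": 49},   # added
}                                                      # id 2 removed

def data_diff(old: dict, new: dict) -> dict:
    added = sorted(new.keys() - old.keys())
    removed = sorted(old.keys() - new.keys())
    changed = {k: (old[k], new[k])
               for k in old.keys() & new.keys() if old[k] != new[k]}
    return {"added": added, "removed": removed, "changed": changed}

report = data_diff(old_rows, new_rows)
print(report)
# A CI job could fail the build when unexpected removals or changes appear.
```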

Conclusion

Data should be an asset for businesses. It is unfortunate that not enough organizations take advantage of their data repositories, resulting in unexplored or untapped potential. Teams can use data to improve decision-making, products and services, marketing and sales efforts, customer relations, and many other areas. To get started with productive data use, it helps to demolish information silos, establish a data-driven workplace, use AI-enhanced business intelligence solutions, provide data analysis and management courses, and implement data quality measures to build data trustworthiness.

stan.beer@itwire.com (Cliff Stanton) Data Mon, 19 Aug 2024 16:53:08 +1000
From SEO to LMO: HubSpot launches the first free tool for AI discovery
https://itwire.com/business-it-news/data/from-seo-to-lmo-hubspot-launches-the-first-free-tool-for-ai-discovery.html

PRODUCT ANNOUNCEMENT:  AI Search Grader analyses brand awareness and sentiment across AI chatbots as more customers turn to tools like ChatGPT for search

Customers and prospects are ditching traditional search in favour of researching with AI for quick, personalised answers. In fact, usage of tools like ChatGPT to answer questions is up 37%, while search engines are down 11%.* For marketers who’ve been focused on getting their brands discovered with SEO, now is the time to master a new craft: Language Model Optimisation (LMO).

At HubSpot, we’ve always given marketers the tools they need to connect with their customers and drive growth. We did it with our Website Grader that helped millions of marketers optimise for search engines. And we’re doing it today with the launch of AI Search Grader, the first free tool that helps brands understand how they show up in Large Language Models (LLM) and AI search.

“What marketers have been doing for years to attract customers won’t be as effective in the future. People just aren’t using traditional search like they used to,” said Nicholas Holland, VP of Product and GM of Marketing Hub at HubSpot. “As the tides are turning toward AI search, we want to help marketers stand out and get discovered in this new environment. That’s exactly what AI Search Grader will give marketers – a new, free way to connect with their customers in the AI era.”

AI Search Grader takes marketers from SEO to LMO

Today, marketers are manually testing how their brands are showing up in AI search: visiting different experiences, crafting specialised prompts, aggregating responses, and synthesising results – all just to get a fraction of the information they need. And there’s little guidance to help them act on what they find.

HubSpot’s AI Search Grader eliminates the need for AI expertise, and takes the guesswork out of how a brand is showing up in AI search. It does prompt engineering for marketers and contextualises their brand’s performance to make LMO simple. 

AI Search Grader includes four key components: overall grade, brand sentiment score, share of voice score, and personalised analysis.
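
Conceptually, this kind of grading can be approximated by prompting one or more chatbots about a category and measuring how often, and how favourably, a brand appears. The sketch below uses a stubbed `ask_chatbot` function, an invented brand name and naive keyword scoring; it illustrates the idea and is not HubSpot's tool.

```python
# Naive "share of voice" and sentiment sketch across AI chatbot answers.
# ask_chatbot is a stub; a real version would call one or more LLM APIs.

BRAND = "ExampleCRM"   # hypothetical brand being graded
PROMPTS = [
    "What are the best CRM platforms for small businesses?",
    "Which CRM would you recommend for a growing startup?",
]
POSITIVE = {"best", "recommend", "excellent", "popular"}
NEGATIVE = {"avoid", "expensive", "limited", "poor"}

def ask_chatbot(prompt: str) -> str:
    # Stubbed answer so the sketch runs offline.
    return ("ExampleCRM and OtherCRM are popular choices; many reviewers "
            "recommend ExampleCRM for its pricing, though some find it limited.")

def grade(brand: str, prompts: list[str]) -> dict:
    answers = [ask_chatbot(p).lower() for p in prompts]
    mentions = sum(brand.lower() in a for a in answers)
    words = " ".join(a for a in answers if brand.lower() in a).split()
    sentiment = (sum(w.strip(".,;") in POSITIVE for w in words)
                 - sum(w.strip(".,;") in NEGATIVE for w in words))
    return {
        "share_of_voice": mentions / len(prompts),   # fraction of answers mentioning the brand
        "sentiment_score": sentiment,                # crude positive-minus-negative count
    }

print(grade(BRAND, PROMPTS))
```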

AI Search Grader is free and globally available in English for any marketer to use.

stan.beer@itwire.com (HubSpot) Data Thu, 15 Aug 2024 10:54:33 +1000
INTERVIEW - From Sydney to Silicon Valley: HP’s David McQuarrie's Top Tips to Global Tech Leadership
https://itwire.com/business-it-news/data/from-sydney-to-silicon-valley-hp%E2%80%99s-david-mcquarrie-s-top-tips-to-global-tech-leadership.html

In an exclusive interview with iTWire, David McQuarrie, HP's Global Chief Commercial Officer, shared invaluable lessons from his extraordinary journey across the globe.

Hailing from Australia, McQuarrie’s career has evolved from regional roles in Europe to influential positions in the global boardrooms of HP in the United States. His story underscores the importance of diverse experiences in shaping and transforming global leadership in the dynamic tech industry.

Tip 1: Embrace Creativity and Perseverance
Growing up in Sydney, McQuarrie was deeply influenced by the values his homeland embodies.

“Having lived and worked in a lot of places in the world, the ability to be imaginative is something Australians are good at,” McQuarrie said. “They’ve had to be resilient and ingenious in solving problems.”

These strengths have built a strong foundation for his career and are invaluable to companies in the current tech landscape. McQuarrie’s cultural blend of creativity and perseverance became his crucial strength as he navigated global workplaces across continents.

Tip 2: Foster Inclusivity
Leading a diverse team at HP, McQuarrie emphasized the importance of inclusivity, a value that sits at the heart of the company’s culture. “It feels like a family,” he said, reflecting on the strong connection between leadership and employees through decades of continuous transformation. This culture of inclusivity has strengthened internal trust and bolstered consumer confidence in HP.

“It’s one of the things that attracted me to the company,” McQuarrie said. “It’s an organisation people have trusted since the very beginning.”

Tip 3: Trust is Crucial
Trust is also a foundational cornerstone of HP’s approach to emerging technologies, including its growing AI strategy. Despite consumers’ concerns about AI, McQuarrie remains optimistic about its potential.

“Our role is to ensure that we are providing the best, most secure, and most trusted technology, where people feel protected and empowered,” he said.

He envisions AI as a transformative tool that will enable personalised working and living, enhance the HP hybrid experience and shape a future of growth and well-being.

As HP continues to innovate, McQuarrie’s journey from Australia to a global leadership position offers an inspiring lesson in values, transformation, and trust in technology leadership.

stan.beer@itwire.com (Team Writer) Data Mon, 12 Aug 2024 21:25:58 +1000
An AI strategy isn't enough; you need prepared data, and Cloudera says it has the platform to help you succeed
https://itwire.com/business-it-news/data/an-ai-strategy-isn-t-enough-you-need-prepared-data-and-cloudera-says-it-has-the-platform-to-help-you-succeed.html

Hybrid data, analytics, and AI platform Cloudera says AI will add about $US15.7T - that's trillion - to the global economy by 2030. To get there your data needs to be in order, and Cloudera says it has the best platform to help you succeed, and not be left behind.

Cloudera CEO Charles Sansbury and other executives took to the stage at Cloudera Evolve 2024 APAC being held in Singapore this week, to announce new product offerings, share customer stories, and talk about the company's growth.

A continual, recurring theme of the event was how vast the Cloudera customer base is, and the enormous scale of data being managed by the platform. In fact, Sansbury said, the company brings in more than a billion USD in revenue, is used across the global top 2000 companies including eight of the top 10 banks, and has over 25 exabytes of data under management.

"That's more data than any other company in the world," Sansbury said, pictured below.

[Image: Cloudera CEO Charles Sansbury]

An exabyte is 1,000,000,000,000,000,000 bytes. We can relate to a terabyte - that's 1000 gigabytes, which in turn is 1000 megabytes. We have disk drives measured in terabytes now. Well, 1000 terabytes is called a petabyte. And an exabyte is 1000 petabytes. Some people say all the words ever spoken by humanity would sum to 5 exabytes. And Cloudera manages more than five times that much.
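
For readers who prefer the arithmetic spelled out, a few lines make the scale explicit; the "all words ever spoken" figure is the popular rough estimate quoted above, not a precise measurement.

```python
# Back-of-the-envelope scale check for the figures quoted above.
MB = 10**6
GB = 1000 * MB
TB = 1000 * GB
PB = 1000 * TB
EB = 1000 * PB   # 1 exabyte = 1,000,000,000,000,000,000 bytes

cloudera_data = 25 * EB            # data under management, per Sansbury
all_words_spoken = 5 * EB          # popular (rough) estimate quoted above

print(f"{cloudera_data:,} bytes")            # 25,000,000,000,000,000,000
print(cloudera_data / all_words_spoken)      # 5.0 times the estimate
```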

It's so much data that Sansbury says some of Cloudera's customers - a subset of that 25 exabytes - have more data individually than all the data contained by some of the public cloud hyperscalers across all their customers combined.

Obviously then, Cloudera is not itself using those hyperscalers as its backbone for data. Well, not completely. What's going on here is that Cloudera is a hybrid platform - in fact, Sansbury says it is the only true hybrid platform - working with customer data wherever it may be. Some data may be on-premises, some on this cloud, some on that cloud. No matter where your data is, Cloudera believes you, not your vendors, should own and govern your own data. Its products and tools will seamlessly accommodate that. As such, Cloudera can be deployed in your own environment or on a public cloud, even multiple public clouds.

Cloudera is a billion dollar company today, but it was not always the case. The business was founded in June 2008 by a team of Google, Yahoo!, and Facebook engineers and an Oracle database executive, joined the following year by a Hadoop co-founder. Originally, Cloudera offered a free product based on Hadoop, with revenue based on support and consulting. This morphed into a commercial Hadoop distribution. From 2009 to 2011 the company received $US 70 million of investment funding. In 2014 Cloudera raised another $US 160m from investors and a whopping $US 740m from Intel in exchange for an 18% stake of the company. It is almost a history lesson in the rise of analytics and big data and the role Hadoop played at the time.

Yet, the story changed as public cloud services emerged. As you might expect in the typical Silicon Valley journey, Cloudera went public in 2017 only to find its share price drop due to falling sales and stiff competition from public cloud services like AWS. Intel sold back its 18% ownership for less than half what it paid, $US 314m, in 2020.

Cloudera rose to the challenge of differentiating itself, strategically foreseeing the valuable role of open source table formats like Iceberg, as well as the shift from all-in public cloud to hybrid multi-cloud. Cloudera proved to be prescient on these, and in late 2021 went private after an all-cash acquisition by KKR and Clayton, Dubilier and Rice. Today, Cloudera remains a private company, but states it has over one billion USD in revenue, and is profitable to the tune of hundreds of millions. Meanwhile, other prominent data platform products are now rapidly embracing Iceberg also, while businesses globally are seeking to reduce their cloud bills by moving workloads to more appropriate locations, which might include on-premises but could be other clouds. In short, the trends Cloudera foresaw have come to pass, and Cloudera finds itself in a strong position to assist organisations of all sizes in their data and AI journey, especially as a first mover in adopting Iceberg as central and native to its platform.

It's some distance from Cloudera's roots in sifting through huge stores of corporate information to its position today as a major player helping customers prepare their data for AI purposes, be they tightly regulated financial industries, or industries with significant security, privacy, and sovereignty requirements, or even simply any company that wants to be more efficient with its spending while still achieving its goals.

In fact, one Cloudera customer who most definitely has security front-of-mind is the Australian Federal Police, who were named Cloudera's ANZ Customer of the Year during the event.

The drive to infuse, embed, and leverage AI is now so great that it is projected to add a huge 15.7 trillion US dollars to the global economy by as soon as 2030.

Yet, wanting to do AI is different from doing AI. And having data is most definitely different from having prepared and organised and governed data that's ready to use.

Sansbury has previously likened the AI journey to a field; you want to plant crops but boy, there's a lot of work that goes before it. You must plow the field into all kinds of nice rows. But there's dirt to shift, there are rocks to remove, you may need to boost soil and nutrients. Finally, your field is prepared. Only then can you plant crops and expect great results.

So too with data; Cloudera conducted research that formed the basis for its state of enterprise AI and modern data architecture report, revealing the top barriers to AI adoption presently are security and compliance concerns, lack of training and skills to manage AI, and the sheer cost of AI adoption itself.

[Image: Cloudera SVP APJ Remus Lim]

Even so, ignore AI at your peril. “The gap between businesses leveraging AI and those lagging behind is widening rapidly,” said Cloudera SVP APJ Remus Lim, pictured above.

Here's where Cloudera comes in; the product is no mere database but a platform that, using Sansbury's analogy, helps you plow your field and get your rows lined up ready for planting, by assisting you in ensuring a strong data foundation.

“The need for a strong data foundation has never been clearer," Lim said. "AI is the key that unlocks the true potential of this valuable asset, but it is just the first step. In fact, our recent survey showed that 88% of enterprises are adopting AI in some capacity. However, organisations are seeking greater support in terms of data infrastructure and skills to operationalize their AI and access all of their data such that critical insights are trustworthy and unbiased; only then will the truly transformative impact of AI be seen.”

And, by embracing hybrid multi-cloud and working with your data where it resides, Cloudera can eliminate much of the bulky cost of shifting data to, and storing it in, public clouds.

"Cloudera is the only true hybrid platform for data analytics and AI. We enable global enterprises to use data to solve the impossible today," Lim said.

stan.beer@itwire.com (David M Williams) Data Wed, 07 Aug 2024 19:17:41 +1000
Cognizant Moment: Powering the next generation of data-led, AI-fueled experiences at enterprise scale
https://itwire.com/business-it-news/data/cognizant-moment-powering-the-next-generation-of-data-led,-ai-fueled-experiences-at-enterprise-scale.html
Ben Wiener, global head of Cognizant Moment

New digital experience practice will harness AI for enhanced customer engagement and business innovation

COMPANY NEWS: Cognizant announced the launch of Cognizant Moment™, the next evolution of the company's digital experience practice area, designed to help clients leverage the power of artificial intelligence (AI) to reimagine customer experience and engineer innovative strategies aimed at driving growth.

The new practice builds on the digital experience expertise and solutions Cognizant has delivered for clients over the last 20+ years, and advanced through a series of key acquisitions in the digital experience space. Now, as the ways consumers interact with technology are shifting to include multi-modal experiences, Cognizant aims to give clients the tools and insights they need to drive differentiation, cultivate customer loyalty and become future-ready.

Cognizant Moment™ is built on the foundation of intelligent ecosystem orchestration, a strategy that connects experiences as well as the underlying data, technology and operations across the entire enterprise ecosystem. This approach enables clients to leverage generative AI's content generation capabilities, along with human ingenuity, to innovate, differentiate and drive growth by informing and automating processes, and creating dynamic, hyper-personalised experiences for their customers.

"Every experience comes down to a series of moments, seamlessly enabled by intuitive strategy, human-centred design and curated technology," said Ravi Kumar S, CEO, Cognizant. "Cognizant research shows that a majority of G2000 business decision makers believe that generative AI will help them create new products and services, and many of them are already using the technology to design or deliver them. We aim to meet the moment, as enterprises move to differentiate through experience, and enable them to implement a data and technology-driven approach to shape the customer journey, rather than relying on traditional marketing agencies."

Cognizant Moment aims to address these challenges directly, with a suite of capabilities across multiple workstreams, including:

  • Intelligent Ecosystem Orchestration: Connecting data, technology and operations to create a dynamic experience ecosystem;
  • Business Transformation: Innovating and reimagining operations to assist clients in generating new growth and economic value;
  • Digital Products & Platforms: Using human-centric agile methods to deliver products that resonate with customers and employees;
  • Marketing & Content: Transitioning from manual management to AI-driven personalised marketing, supported by comprehensive content services;
  • Commerce: Moving beyond transactions to create immersive commerce experiences;
  • Learning: Crafting interactive experiences to enable behaviour change and skill-based performance.

"Cognizant Moment is one of the few modern experience practices that can deliver at scale, globally, and with the cutting-edge technical expertise required to effect the needed change," said Ben Wiener, global head of Cognizant Moment. "As experience leaders grapple with a confluence of technological and behavioural forces, we aim to help clients change the game and equip them to grow and thrive competitively, backed by Cognizant's long heritage of digital experience domain knowledge, technology and industry expertise."

"The effectiveness of technology implementations must be measured by the experiences of both customers and employees," said Phil Fersht, CEO and Chief Analyst, HFS Research. "Cognizant Moment brings the firm's best assets and talent together to maximise AI impact and people value."

According to New Work, New World, a research report published by Cognizant analysing data from Oxford Economics, generative AI could inject an additional 3.5% of productivity growth into the US economy by 2032, estimated at $1 trillion per year. Additionally, by the same year, up to 90 percent of jobs could be disrupted in some way by generative AI, from administrative assistants to CXOs. By helping clients drive productivity gains with generative AI and changing the way marketers connect with customers, Cognizant Moment aims to play a central role in shaping the future of customer experience.

To learn more about Cognizant Moment, visit this page.

About Cognizant
Cognizant (Nasdaq: CTSH) engineers modern businesses. We help our clients modernize technology, reimagine processes and transform experiences so they can stay ahead in our fast-changing world. Together, we're improving everyday life. See how at www.cognizant.com or @cognizant.

stan.beer@itwire.com (Cognizant) Data Thu, 01 Aug 2024 11:04:12 +1000