On the latest episode of the Data Humanized Podcast, host Mark Palmer interviewed Fernando Schwartz, Vice President (VP) of Data Science & ML Engineering at ADP. In the episode, the pair discuss how to make content AI-ready, keep people in the loop, upskill the workforce, improve customer experiences, and integrate data across the enterprise.
Keep reading for the top takeaways from Mark and Fernando’s conversation.
Generative AI, while incredibly efficient, doesn’t always grasp the full context or nuances in data and relationships the way humans do.
That’s why it’s essential to prep the content you feed into generative AI and LLMs.
Fernando shared that generative AI and LLM output is only as good as the inputs you give the technology.
To generate high-quality output, he recommends using ontologies or tags to add context.
Structuring knowledge for ontologies has traditionally been human-intensive and slow, but Fernando and his team have generative AI take on the heavy lifting.
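To make the idea concrete, here is a minimal, hypothetical sketch of LLM-assisted ontology tagging. It is not ADP's pipeline: the `call_llm()` helper, the tag vocabulary, and the prompt are all invented for illustration, and a real system would plug in an actual model API and a real ontology.

```python
# Illustrative sketch only. call_llm() is a hypothetical stand-in for whatever
# LLM API a team actually uses, and ONTOLOGY_TAGS is an invented vocabulary.
import json

ONTOLOGY_TAGS = ["payroll", "benefits", "compliance", "time-tracking", "reporting"]

def call_llm(prompt: str) -> str:
    """Placeholder for a real model call; returns the model's text response."""
    raise NotImplementedError("Replace with your LLM provider's API call.")

def tag_document(text: str) -> list[str]:
    """Ask the model to map a document onto a controlled tag vocabulary."""
    prompt = (
        "Label the document below using ONLY tags from this list: "
        f"{ONTOLOGY_TAGS}. Respond with a JSON array of tags.\n\n"
        f"Document:\n{text}"
    )
    tags = json.loads(call_llm(prompt))
    # Filter out anything outside the ontology so the tagging stays controlled.
    return [t for t in tags if t in ONTOLOGY_TAGS]
```

Constraining the model to a fixed tag list is what turns free-form generation into structured context that downstream retrieval and prompting can rely on.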
Fernando also recommends making content inputs AI-ready by doubling down on generative AI and LLM use. He shared that teams at ADP use LLMs not only to reformat data but also to evaluate whether the reformatted outputs are better than the original inputs.
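Below is a similarly hedged sketch of that pattern: one call rewrites a record into a cleaner format, and a second call acts as a judge that only accepts the rewrite when it is an improvement. Again, `call_llm()` is a hypothetical placeholder, and the prompts are illustrative rather than anything ADP has published.

```python
# Illustrative sketch only: reformat with one LLM call, then use the model as a
# judge to decide whether the rewrite actually beats the original input.

def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for a real LLM API call."""
    raise NotImplementedError("Replace with your LLM provider's API call.")

def reformat_record(text: str) -> str:
    """Rewrite a messy record into clean, consistently structured text."""
    return call_llm(
        "Rewrite the following record as clean, well-structured plain text, "
        "preserving every fact:\n\n" + text
    )

def is_improvement(original: str, candidate: str) -> bool:
    """Model-as-judge: accept the rewrite only when it is clearly better."""
    verdict = call_llm(
        "Compare the ORIGINAL and CANDIDATE versions of a record. Reply with "
        "exactly one word, BETTER or WORSE, describing the CANDIDATE.\n\n"
        f"ORIGINAL:\n{original}\n\nCANDIDATE:\n{candidate}"
    )
    return verdict.strip().upper().startswith("BETTER")

def make_ai_ready(text: str) -> str:
    """Keep the reformatted version only when the judge prefers it."""
    candidate = reformat_record(text)
    return candidate if is_improvement(text, candidate) else text
```

The second call is the safeguard: automated reformatting only helps if something, human or model, checks that the output is genuinely better than what went in.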
There’s no doubt that generative AI has transformed business operations with the speed, scalability, and efficiency it offers. However, it still requires human intervention to reach its full potential.
During the podcast episode, Fernando stressed the critical role people have to play in enterprise generative AI applications.
He noted that employees are needed to increase the accuracy of generative AI outputs — especially at organizations like ADP where data accuracy is essential.
Adoption of AI and LLM technologies is expected to grow rapidly in enterprises like ADP. To keep up with the evolution of these technologies, technical and non-technical employees alike need proper upskilling.
On the podcast, Fernando stressed the growing need for AI and LLM skills across the workforce.
As enterprises like ADP explore where to integrate generative AI, customer experience workflows are a natural place to apply it.
This use case is not just about efficiency, but about rethinking the entire customer journey and leveraging generative AI to understand and anticipate customer needs.
For example, ADP has used AI to better serve customers who use the search bar: the system picks up on a customer's likely need and proactively suggests a solution.
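One way to picture that kind of proactive search, purely as an assumption about how such a feature might be wired rather than a description of ADP's product, is to classify the query's intent and map recognized intents to a suggested next step.

```python
# Illustrative sketch only: the intents, suggestions, and call_llm() helper are
# invented for demonstration and do not describe ADP's actual search system.

SUGGESTED_NEXT_STEPS = {
    "update_direct_deposit": "Offer the guided bank-details update flow.",
    "download_tax_form": "Link straight to the tax-documents page.",
    "reset_password": "Surface the self-service password reset.",
}

def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for a real LLM API call."""
    raise NotImplementedError("Replace with your LLM provider's API call.")

def suggest_from_search(query: str) -> str | None:
    """Infer the likely need behind a search query and propose a next step."""
    intent = call_llm(
        "Classify this search query as one of "
        f"{list(SUGGESTED_NEXT_STEPS)} or 'other'. Reply with the intent only.\n\n"
        f"Query: {query}"
    ).strip()
    # None means no confident match; fall back to ordinary search results.
    return SUGGESTED_NEXT_STEPS.get(intent)
```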
During the podcast, Fernando reflected further on the potential of generative AI to improve customer experiences.
A challenge enterprises face is integrating data from multiple sources.
When teams collect and analyze data, it’s typically done with a specific use case or department goal in mind.
However, this narrow perspective overlooks the value data can bring when it's integrated into a broader organizational context.
On the podcast, Fernando highlighted the importance of showing data teams how their work fits into the larger strategy. In his view, data teams must understand that their contributions are not just department-specific but part of a larger repository of data that fuels insights organization-wide.
Generative AI and LLMs are a game-changer for enterprise operations. These technologies improve organizational outcomes by speeding up content workflows, sharpening customer experiences, and turning integrated data into organization-wide insights.
As these technologies become more embedded in organizations, people leaders must focus on thoughtful application, workflow integration, and developing the digital skills of employees.
Partner with Correlation One to equip your workforce with advanced AI and LLM skills. Get started today.