Partner Content Articles and Reports
This section features our sponsored interviews, articles, and reports produced in partnership with some of the most exciting brands in artificial intelligence. Explore our library of partner content below:
The financial sector was one of the first to start experimenting with machine learning applications for a variety of use-cases. In 2019, banks and other lenders are looking to machine learning as a way to win market share and stay competitive in a changing landscape, one in which people are no longer exclusively going to banks to handle all of their banking needs.
This article was written by Sergii Gorpynich, co-founder and CTO at Star, and co-written by Perry Simpson, Managing Director of Star. It was edited and published in alignment with our transparent Emerj sponsored content guidelines. Learn more about reaching our AI-focused executive audience on our Emerj advertising page.
Robotic process automation, or RPA, has dominated much of the automation conversation in the insurance industry for several years. RPA captures the manual steps employees take to log into software, search documents, and enter data, and then replicates those steps automatically.
The advent of machine learning in finance ushered in a keen interest in using AI to automate processes from fraud detection to customer service. While some use-cases aren’t nearly as established as others, our research leads us to believe that in the coming five years, banks will continue to invest in machine learning for risk-related processes, including underwriting.
Automated loan processing and underwriting is not a new concept in the banking and financial services industry. Lenders have consistently faced pressure to reduce the costs and turnaround time associated with internal loan processing.
Insurers are looking to leverage all of the digital customer data that is now available to them, including one new data source that some of the largest insurance enterprises claim to be actively collecting: real-time data streams from the Internet of Things (IoT).
In 2018, James Kobielus wrote an article on the AI market’s shift to workload-optimized hardware platforms, in which he proposed:
Workload-optimized hardware/software platforms will find a clear niche for on-premises deployment in enterprises’ AI development shops. Before long, no enterprise data lake will be complete without pre-optimized platforms for one or more of the core AI workloads: data ingest and preparation, data modeling and training, and data deployment and operationalization.
We are seeing Kobielus' words come true. In the past year, nearly 100 companies have announced some sort of AI-optimized IP, chip, or system, primarily for inference workloads but also for training. Hyperscalers like Facebook, Amazon, and Google are increasingly talking publicly about "full-stack" optimization of AI, from silicon, through algorithms, up to the application layer.