LLM Community Development with Antonio Velasco Fernández and Jose Pablo Cabeza García
Large language models (LLMs) have become one of the most important technologies to emerge in recent years. Many of the most prominent LLM tools are closed source, which has led to great interest in developing open-source alternatives.
Antonio Velasco Fernández is a Data Scientist and Jose Pablo Cabeza García is a Lead Data Engineer, both at Elastacloud. In this episode, recorded in 2023, they joined the podcast to talk about LLMs and the importance of community development for LLMs.
Jordi Mon Companys is a product manager and marketer who specializes in software delivery, developer experience, cloud native, and open source. He has developed his career at companies like GitLab, Weaveworks, Harness, and other platform and devtool providers. His interests range from software supply chain security to open source innovation. You can reach out to him on Twitter at @jordimonpmm.
Sponsorship inquiries: sponsor@softwareengineeringdaily.com
Sponsors
This episode of Software Engineering Daily is brought to you by Vantage. Do you know what your cloud bill will be for this month?
For many companies, cloud costs are the number two line item in their budget and the number one fastest growing category of spend.
Vantage helps you get a handle on your cloud bills, with self-serve reports and dashboards built for engineers, finance, and operations teams. With Vantage, you can put costs in the hands of the service owners and managers who generate them—giving them budgets, alerts, anomaly detection, and granular visibility into every dollar.
With native billing integrations with dozens of cloud services, including AWS, Azure, GCP, Datadog, Snowflake, and Kubernetes, Vantage is the one FinOps platform to monitor and reduce all your cloud bills.
To get started, head to vantage.sh, connect your accounts, and get a free savings estimate as part of a 14-day free trial.
This episode of Software Engineering Daily is brought to you by Starburst.
Struggling to deliver analytics at the speed your users want without your costs snowballing?
For data engineers who battle to build and scale high-quality data pipelines, Starburst’s data lakehouse platform helps you deliver exceptional user experiences at petabyte scale, without compromising on performance or cost.
Trusted by the teams at Comcast, DoorDash, and MIT, Starburst delivers the adaptability and flexibility a lakehouse ecosystem promises on an open architecture that supports Apache Iceberg, Delta Lake, and Hudi, so you always maintain ownership of your data.
Want to see Starburst in action? Get started today with a free trial at starburst.io/sed.
As a listener of Software Engineering Daily you understand the impact of generative AI. On the podcast, we’ve covered many exciting aspects of GenAI technologies, as well as the new vulnerabilities and risks they bring.
HackerOne’s AI red teaming addresses the novel challenges of AI safety and security for businesses launching new AI deployments.
Their approach involves stress-testing AI models and deployments to make sure they can’t be tricked into providing information beyond their intended use, and that security flaws can’t be exploited to access confidential data or systems.
Within the HackerOne community, over 750 active hackers specialize in prompt hacking and other AI security and safety testing.
In a single recent engagement, a team of 18 HackerOne hackers identified 26 valid findings within the first 24 hours and accumulated over 100 valid findings across the two-week engagement.
HackerOne offers strategic flexibility, rapid deployment, and a hybrid talent strategy. Learn more at Hackerone.com/ai.