Using LLM locally
2024-10-24, Europe - Main Room

Using LLMs isn't limited to online (and often paid) services; you can run them for free using open-source models and applications.

It has become easy and convenient to run an LLM locally on your machine, even with only a couple of gigabytes of RAM. Use cases include text generation, summarization, question answering, and even building local RAG (retrieval-augmented generation) pipelines. All of this is available free and open source, with minimal setup.

This talk will show a quick demo of some tools and provide references to help you set up your own LLM app.

Pauline's focus gravitates towards offensive cybersecurity, artificial intelligence, and programming culture. She has a background spanning various fields, including linguistics, criminology, cybersecurity, computer engineering, and education. By blending approaches from the humanities with deep technical insight, she offers a unique lens on cyber threats and their evolution. These days she provides AI development and training to make AI accessible to all. She is the founder of the DEF CON group Paris and a French vice-champion para-climber.
