    __                               __  
   / /_  ___  __  ______  ____  ____/ /  
  / __ \/ _ \/ / / / __ \/ __ \/ __  /   
 / /_/ /  __/ /_/ / /_/ / / / / /_/ /    
/_.___/\___/\__, /\____/__ /_/\__,_/     
  _____/ /_  ___/ _/ /_/ /_(_)___  ____ _
 / ___/ __ \/ __ `/ __/ __/ / __ \/ __ `/
/ /__/ / / / /_/ / /_/ /_/ / / / / /_/ / 
\___/_/ /_/\__,_/\__/\__/_/_/ /_/\__, /  
                                /____/   

Getting Serious with Large Language Models:
A Practical Introduction

  • 🕹️ playful and explorative learning
  • 🪜 step by step
  • ⏱️ short and simple
  • 🦆 no coding skills required to get started
  • 🛤️ putting you on track to use AI like a pro
  • 🏡 all-local AI
  • ❤️ open source models and software

Lessons

  • Lesson 01: ⛓️ Chains and 💬 Chats with 🫥 Placeholders [notebook]
  • Lesson 02 (planned): Devise and Control Sophisticated Workflows (If-Then-Conditions, Branches, Iterations, etc.)
  • Lesson 03 (planned): Connecting LLM-Workflows To Your Data
  • Lesson 04 (possible): Understanding and Exploiting Decoding Techniques
  • Lesson 05 (possible): Structured and Constrained Output
  • Lesson 06 (possible): Integrating Tools
  • Lesson 07 (possible): Multi-Agent Workflows
  • Lesson 08 (planned): Unravelling Beyond-Chatting, and Preparing You for the Real Stuff

Installation (might take >1h)

💡 INFO

If you're unsure about any of the following, don't hesitate to head over to huggingface.co/chat and talk it through with a strong LLM. If you still don't feel confident enough to move ahead, team up with friends or ask a colleague for help.

Required or recommended:

  • A local LLM inference server
  • Git
  • Python
  • VS Code
  • VS Code Python Extension

Beyond-chatting assumes that you're running an LLM on your own computer. To do so, install a local LLM app; then download a local model, as described in your LLM app's documentation, and start an OpenAI-compatible inference server. Note: I've been testing the course with 🦙 meta-llama/llama-3.2-3b-instruct.
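Once the server is running, you can check that it answers OpenAI-style requests. The sketch below is only an illustration: it assumes the server listens on localhost port 1234 (ports differ between LLM apps, so check your app's documentation) and that the model is served under the name mentioned above.

# quick sanity check of the OpenAI-compatible endpoint (port 1234 is an assumption)
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/llama-3.2-3b-instruct",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}]
      }'

If the reply is a JSON object with a choices field containing the model's answer, your local server is ready for the lessons.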

In your > Terminal app: set up git, which you'll need to download the beyond-chatting course, as pointed out here.
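First-time git setup mostly means telling git who you are; a minimal sketch (name and email are placeholders, use your own):

# identify yourself to git (placeholders)
git config --global user.name "Your Name"
git config --global user.email "you@example.com"

# confirm git is installed and configured
git --version
git config --list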

To set up Python, I suggest you install uv as described here, and then install the latest Python version following these instructions.
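With uv on your PATH, installing and checking a Python version looks roughly like this (the exact version number is just an example):

# install a recent Python via uv (version number is an example)
uv python install 3.12

# list the Python versions uv can see
uv python list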

To set up VS Code, download and install the code editor.

To set up the VS Code Python Extension, follow the instructions here (some more background).

For a pleasant and less intimidating look, you might consider installing the catppuccin theme and the corresponding icon set.

Now, to finally get beyond-chatting, open your > Terminal app, cd into the folder where you plan to store the beyond-chatting course, and clone the repo with git:

cd my-projects
git clone https://github.com/debatelab/beyond-chatting.git

This downloads the course into a newly created beyond-chatting folder.

Next:

cd beyond-chatting
uv venv  # create a virtual environment for beyond-chatting
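If the repository ships a pyproject.toml with pinned dependencies (an assumption on my part; check the repo), you can let uv install them into that environment:

uv sync  # install the course's dependencies (assumes a pyproject.toml in the repo)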

Then start VS Code and open the beyond-chatting folder as a workspace.

🎉 Congrats. You're now ready to start.

Other Learning Resources

About

This is currently a side project of mine. If the course is picked up and found useful, I'm happy to expand the tutorials.

Please ⭐️ star this repo if you think it's useful.

Feel free to suggest topics that should be covered via GitHub issues.
