
♥️ Function Calling-only model #297

Open
finom opened this issue Feb 13, 2024 · 4 comments

Comments

@finom

finom commented Feb 13, 2024

I'm very impressed by the work of the web-llm team and really wanted to integrate it into an existing project. Unfortunately, the current models that are capable of doing something useful are too heavy, and my average laptop doesn't feel well when I run common LLMs such as Llama 2 in the browser. I'm a web developer, and I can't promote the project to my clients since it consumes too many resources and makes computers glitch. For common use cases I'm still forced to use third-party APIs such as OpenAI's. But what intrigues me is the idea of a small model that is unable to perform common text completion tasks but is capable of generating Function-Calling-like objects from user input.

I'm not a data scientist at all, and my workload (currently) doesn't allow me to work out a solution myself, but I can imagine that web-llm could gain a lot of real users (not just geeks like me who like to try things) by providing a way to turn user input into actions within the browser. My hypothesis is based not on real experience but on intuition, which is often wrong in such situations. I'd guess that such a model would be quite performant and compact, because it wouldn't need to know about "everything", and I'd also guess that this would be a real killer feature that would make everybody look at web-llm as a real business solution, not just an experiment that will make more sense in decades, when average computers are much more performant.

Good idea? Bullshit? If it makes at least some sense, I hope someone who's not as dumb as me can take care of it. Let me know what you think!

@CharlieFRuan
Contributor

CharlieFRuan commented Feb 13, 2024

Really appreciate your input here @finom! Function calling is part of our roadmap, as shown in O2 here: #276. The goal is to support OpenAI-like APIs and features (function calling, multimodal inputs, embeddings, etc.).

Hopefully we can start from here and then let WebLLM make actual actions in the browser!

@finom
Author

finom commented Feb 13, 2024

@CharlieFRuan that sounds fantastic! I can imagine a lot of interesting use cases that I'd be able to integrate into the internal tools I usually build. Fingers crossed!

@monarchwadia

I was excited to use web-llm precisely because of the potential of doing function calling within the browser environment. Very much looking forward to this feature :-)

@finom finom changed the title Function Calling-only model ♥️ Function Calling-only model Feb 23, 2024
@CharlieFRuan
Contributor

Hi! Please check out examples/function-calling. It is supported in 0.2.41, via PR #451. Currently only the Hermes-2-Pro models are supported.
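For readers wondering what "function calling in the browser" looks like in practice, here is a minimal sketch of the client-side plumbing: an OpenAI-style tool definition and a local dispatcher that turns a tool call emitted by the model into an actual browser-side action. The `getWeather` name and the handler are purely illustrative assumptions, not part of web-llm's API; the real engine call shape is shown in examples/function-calling.

```typescript
// Sketch only: `getWeather` and the handler below are hypothetical examples,
// not part of the web-llm API. The `tools` array follows the OpenAI-style
// schema that function-calling models are prompted with.

type ToolCall = { name: string; arguments: Record<string, unknown> };

// Tool schema advertised to the model (OpenAI-style).
const tools = [
  {
    type: "function",
    function: {
      name: "getWeather",
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
];

// Map tool names to real browser-side actions (stubbed here).
const handlers: Record<string, (args: Record<string, unknown>) => string> = {
  getWeather: (args) => `Weather for ${args.city}: sunny (stubbed)`,
};

// Dispatch a tool call the model emitted, invoking the matching handler.
function dispatch(call: ToolCall): string {
  const handler = handlers[call.name];
  if (!handler) throw new Error(`Unknown tool: ${call.name}`);
  return handler(call.arguments);
}

console.log(dispatch({ name: "getWeather", arguments: { city: "Paris" } }));
// → "Weather for Paris: sunny (stubbed)"
```

In a real integration, the model's response would carry the tool-call object (name plus JSON arguments), and the dispatcher is what finally "makes actual actions in the browser", as discussed above.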
