
[feat] Direct file output for LLM responses #15

Closed
PeterDaveHello opened this issue Aug 2, 2024 · 5 comments
Labels
enhancement New feature or request v0.2

Comments

@PeterDaveHello

Hi,

First, this project is cool! The command-line interface for LLMs is handy.

I wonder if you would consider adding a feature to write LLM output directly into a file? This could be very helpful for users who want to save responses for later reference, processing, or as a final result of their interaction with the LLM.

A possible implementation could be a command-line option like -o filename.txt or --output filename.txt.

Thank you for your great work and for considering this suggestion.

@simonmysun
Owner

Thank you for your appreciation!

If you want to save the output to a file, you can redirect stdout or pipe the output through tee, depending on whether you want to see the result in your terminal.

ell "What is the capital of France" | tee result.txt

or if you simply don't need the result to be shown in your terminal:

ell "What is the capital of France" > result.txt
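The difference between the two approaches can be demonstrated with any command; here printf stands in for ell so the example runs without the tool installed (the "Paris" reply is a placeholder, not actual model output):

```shell
# tee writes the output to a file AND echoes it to the terminal
printf 'Paris\n' | tee result.txt

# plain redirection writes to the file only; nothing appears on the terminal
printf 'Paris\n' > result2.txt
```

In both cases the file ends up with the same content; tee only adds the terminal copy.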

But adding a dedicated output option is also a good idea; it gives users more flexibility. Thank you for the suggestion!

@simonmysun simonmysun added the enhancement New feature or request label Aug 2, 2024
@PeterDaveHello
Author

Yeah, it's not too complex to do the pipe things manually 😆 but an option would also be great 😄

@simonmysun simonmysun added the v0.2 label Aug 2, 2024
@simonmysun
Owner

Question: What's the expected behaviour in record and interactive mode?

@PeterDaveHello
Author

Thanks for considering the feature! How about we record both the input and output of the LLM in both record and interactive modes? What do you think?

simonmysun added a commit that referenced this issue Aug 4, 2024
Breaking change:
- Argument `-o, --option` has changed to `-O, --option`, to avoid conflict with the `-o, --output` option.
- Interactive mode will enable record mode automatically
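Going by the commit message, usage after the rename would look roughly like this. Everything here is a sketch: the flag spellings come from the commit text, the semantics are an assumption, and ell is stubbed with a shell function so the example runs without the tool installed:

```shell
# Hypothetical stub standing in for the real ell binary: it treats
# "-o FILE" as "write the reply to FILE" (assumed behaviour).
ell() {
  if [ "$1" = "-o" ]; then
    printf 'Paris\n' > "$2"   # placeholder reply, not actual model output
  else
    printf 'Paris\n'
  fi
}

# -o/--output saves the reply directly to a file
ell -o result.txt "What is the capital of France"

# -O/--option is now the spelling for passing options (previously -o)
ell -O temperature=0 "What is the capital of France"
```

Check `ell --help` for the actual flag behaviour; the rename avoids the conflict between the old option flag and the new output flag.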
@simonmysun
Owner

Implemented.
