Feature Request
Because the LLM does not have direct access to the internet, it could return a "special response" that tells the app to download information about a specified "thing". For example:
User: What's Radom?
Assistant: {fetch_wikimedia:Radom}
The app fetches data from Wikidata, Wikipedia, Wiktionary, Nominatim(?), etc. to provide information and saves it in local docs. The user is informed about this and asked to wait a moment.
System: {fetch_ready}
The assistant now has access to the latest information in the cached local docs (a special folder).
Assistant: Radom is a city in central Poland, located in the Masovian Voivodeship, around 100 km south of Warsaw. (continues)
Right now, the assistant sometimes fails to provide valid information, or assumes that towns 50 km apart are close to each other. The fetched answer should be cached, and if the data lake is enabled, the assistant's response should be sent for model training if it was "liked".
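A minimal sketch of how the app side could handle the special response, assuming Python and only the Wikipedia REST API out of the sources listed above (Wikidata, Wiktionary, and Nominatim would be handled similarly). The names `handle_assistant_response`, `fetch_wikipedia_summary`, and the `local_docs/fetched` cache folder are hypothetical, not part of the app today:

```python
import json
import re
import urllib.parse
import urllib.request
from pathlib import Path

# Hypothetical cache location; the real app would use its own local docs folder.
CACHE_DIR = Path("local_docs/fetched")

# Matches the special response, e.g. "{fetch_wikimedia:Radom}".
FETCH_PATTERN = re.compile(r"\{fetch_wikimedia:(?P<topic>[^}]+)\}")


def fetch_wikipedia_summary(topic: str) -> dict:
    """Fetch a page summary from the Wikipedia REST API."""
    url = ("https://en.wikipedia.org/api/rest_v1/page/summary/"
           + urllib.parse.quote(topic))
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)


def handle_assistant_response(text: str):
    """If the assistant emitted a fetch response, download and cache the
    topic, then return the "{fetch_ready}" signal; otherwise return None."""
    match = FETCH_PATTERN.search(text)
    if match is None:
        return None  # ordinary response, nothing to fetch
    topic = match.group("topic").strip()
    summary = fetch_wikipedia_summary(topic)
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    cache_file = CACHE_DIR / f"{topic}.json"
    cache_file.write_text(json.dumps(summary, ensure_ascii=False, indent=2))
    return "{fetch_ready}"
```

The same handler could later check the cache before fetching again, and decide whether the cached document should also go to the data lake when the response is "liked".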