I am using WSL2 with Ubuntu 22.04 and updated fabric to the latest version yesterday. fabric -d fails to change the default model; the fabric --dry-run output below is the evidence. A separate question: what was the purpose of streaming in 1.0 (-sp=<pattern name>), given that 2.0 now excludes streaming?
Are there any examples of using sessions and contexts properly?
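For reference, here is roughly the usage I am guessing at for sessions and contexts. The flag names below (--session, --context, --listsessions, --listcontexts) are my assumption from the help output, and the session/context names are placeholders, so please correct me if this is not the intended workflow:

# Reuse a named session so follow-up prompts share history (assumed flag)
echo "Summarize lectures one through four" | fabric --session=cs-project -p=summarize -m=gemini-1.5-pro-latest
echo "Now turn that summary into puzzle ideas" | fabric --session=cs-project -m=gemini-1.5-pro-latest

# Attach a stored context to a run (assumed flag)
echo "Extract the key ideas" | fabric --context=owasp-top10 -p=extract_wisdom -m=gemini-1.5-pro-latest

# List what already exists (assumed flags)
fabric --listsessions
fabric --listcontexts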
Below, fabric -d fails to change the default model; adding -m=gemini-1.5-pro-latest on the command line is a workaround for the broken default:
pheller@TwinTower2:/mnt/o/ollama.ai/Project/files_to_convert$ fabric -d
Available models:
Gemini
[25] gemini-1.5-pro-latest
Enter the index the name of your default model (leave empty for 'llama3.2:1b' or type 'reset' to remove the value):
25
DEFAULT_VENDOR: Gemini
DEFAULT_MODEL: gemini-1.5-pro-latest
pheller@TwinTower2:/mnt/o/ollama.ai/Project/files_to_convert$ fabric --dry-run
Dry run: Would send the following request:
Options:
Model: llama3.2:1b
Temperature: 0.700000
TopP: 0.900000
PresencePenalty: 0.000000
FrequencyPenalty: 0.000000
empty response
pheller@TwinTower2:/mnt/o/ollama.ai/Project/files_to_convert$ text2file ThitimaGameDesign.pdf |fabric -p=extract_wisdom_agents -m=gemini-1.5-pro-latest
SUMMARY
Dr. Thitima's CS class has a group project. Students will design two AI-generated educational puzzles. One puzzle is based on lectures one through four, the other on OWASP Top 10.
IDEAS
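On the fabric -d issue itself: I assume the defaults end up in ~/.config/fabric/.env (that is where setup put my API keys), so editing that file by hand might be another workaround. This is a guess on my part rather than a confirmed fix, and the DEFAULT_VENDOR/DEFAULT_MODEL key names are taken from the fabric -d output above, not from the documentation:

# Check what the config currently says (assumed file location)
grep -E '^DEFAULT_(VENDOR|MODEL)=' ~/.config/fabric/.env

# Hypothetical manual fix: set the default vendor and model directly
sed -i 's/^DEFAULT_MODEL=.*/DEFAULT_MODEL=gemini-1.5-pro-latest/' ~/.config/fabric/.env
sed -i 's/^DEFAULT_VENDOR=.*/DEFAULT_VENDOR=Gemini/' ~/.config/fabric/.env

# Verify what fabric would actually send
echo "test" | fabric --dry-run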