
🔦 Feature: How to chat with Obsidian and Ollama (local language model)? #8374

Open
doriszhang2020 opened this issue May 2, 2024 · 1 comment
Comments

@doriszhang2020
Please confirm that this feature request does not already exist

  • I confirm there is no existing issue for this

Describe the use case for the feature

Ollama is a good way to run a local language model (like Llama 3 8B), and the AI can quickly organize large amounts of unstructured data. For example, if we copy a lot of information about one person and paste it into the AI, we can ask it to fill in a table we have already set up; it writes each value into the correct location instead of us entering them one by one. Chatting with the AI can also easily produce charts from data. And Obsidian is a Markdown editor, which makes it easy to write many things. If we write in Obsidian, upload the data to NocoDB, and then chat with that information automatically, that would be very useful.
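The "paste unstructured text, ask the model to fill a preset table" step could be sketched against Ollama's local HTTP API. This is a minimal sketch, not a NocoDB feature: the `/api/chat` endpoint, port 11434, and the `llama3:8b` model name are taken from Ollama's published defaults, and the column names are purely hypothetical.

```python
import json
import urllib.request

# Default local Ollama endpoint (assumption based on Ollama's documented defaults).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_extraction_request(pasted_text: str, columns: list[str],
                             model: str = "llama3:8b") -> dict:
    """Build an Ollama /api/chat payload asking the model to place
    free-form text into the columns of a preset Markdown table."""
    header = "| " + " | ".join(columns) + " |"
    divider = "| " + " | ".join("---" for _ in columns) + " |"
    prompt = (
        "Fill in one Markdown table row with the information below, "
        f"using exactly these columns:\n{header}\n{divider}\n\n"
        f"Information:\n{pasted_text}\n\n"
        "Reply with the table only."
    )
    return {
        "model": model,
        "stream": False,  # ask for a single JSON response, not a stream
        "messages": [{"role": "user", "content": prompt}],
    }

def ask_ollama(payload: dict) -> str:
    """Send the request to a locally running Ollama server
    (requires Ollama to be running; not executed in this sketch)."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

For example, `build_extraction_request("Jane Doe, born 1990, lives in Oslo", ["Name", "Year", "City"])` produces a prompt containing the empty table header, and `ask_ollama` would return the model's completed table row.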

Suggested Solution

Additional Context

@Lapis0x0

Lapis0x0 commented May 21, 2024

I'm afraid there is almost no connection between NocoDB and Obsidian or Ollama. Perhaps in the future there will be specialized plugins to convert databases into Markdown table formats supported by LLMs?
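The conversion Lapis0x0 mentions (database rows into a Markdown table an LLM can read) could be sketched as follows. This is a minimal sketch, not an existing plugin: the `/api/v2/tables/{tableId}/records` path and the `xc-token` header are assumptions based on NocoDB's REST API documentation, and `base_url`, `table_id`, and `token` are hypothetical inputs.

```python
import json
import urllib.request

def records_to_markdown(rows: list[dict]) -> str:
    """Render a list of record dicts (e.g. NocoDB rows) as a Markdown table."""
    if not rows:
        return ""
    columns = list(rows[0].keys())
    lines = [
        "| " + " | ".join(columns) + " |",
        "| " + " | ".join("---" for _ in columns) + " |",
    ]
    for row in rows:
        lines.append("| " + " | ".join(str(row.get(c, "")) for c in columns) + " |")
    return "\n".join(lines)

def fetch_records(base_url: str, table_id: str, token: str) -> list[dict]:
    """Fetch rows from a NocoDB instance (endpoint and auth header assumed
    from NocoDB's v2 REST API docs; requires a running server)."""
    req = urllib.request.Request(
        f"{base_url}/api/v2/tables/{table_id}/records",
        headers={"xc-token": token},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["list"]
```

The resulting Markdown string could then be pasted into an LLM prompt, or saved as a note in an Obsidian vault, since Obsidian renders Markdown tables natively.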
