Please confirm if feature request does NOT exist already?
I confirm there is no existing issue for this
Describe the usecase for the feature
Ollama is a good way to run local language models (like Llama 3 8B), and an AI can ingest large amounts of unstructured data quickly. For example, if we copy a lot of information about one person and paste it into the AI, we can ask it to fill in a table we have already set up; it writes each value into the correct location instead of us entering fields one by one. Chatting with an AI also makes it easy to draw charts from data. Obsidian is markdown software that makes it easy to write many things, so if we could write in Obsidian, upload the data to NocoDB, and then chat with that information automatically, it would be very useful.
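As a rough sketch of the workflow described above, the pasted text could be sent to a local model with a prompt that names the table's columns, and the model's JSON reply mapped onto a row before it is sent to NocoDB. The column names, prompt wording, and helper names below are illustrative assumptions, not part of the NocoDB or Ollama APIs:

```python
import json

# Hypothetical columns of a table the user has already set up in NocoDB.
COLUMNS = ["Name", "Email", "Phone"]

def build_extraction_prompt(raw_text: str) -> str:
    """Ask the model to return one JSON object keyed by the table's columns."""
    return (
        "Extract the following fields as a single JSON object with keys "
        f"{COLUMNS}. Use null for any missing field.\n\nText:\n{raw_text}"
    )

def parse_model_reply(reply: str) -> dict:
    """Keep only the configured columns from the model's JSON reply,
    so stray keys the model invents never reach the table."""
    data = json.loads(reply)
    return {col: data.get(col) for col in COLUMNS}

# In a real integration (not shown here), build_extraction_prompt's output
# would be sent to a local Ollama server and the parsed row POSTed to
# NocoDB's REST API; both calls are omitted in this self-contained sketch.
```

The point of the `parse_model_reply` step is that LLM output is unreliable: restricting the reply to the table's own column names keeps a hallucinated extra field from corrupting the row.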
Suggested Solution
Additional Context
I'm afraid there is almost no connection between NocoDB and Obsidian or Ollama. Perhaps in the future there will be specialized plugins to convert databases into the markdown table formats supported by LLMs?