Proofs and Theorem Prover #1
Hello! The accepted proof will be non-empty only when you successfully fill CORGI's missing knowledge through user dialog. To help further, can you tell me which statement you tried and what your responses to CORGI's prompts were? You are also right about the second point, thanks for bringing it up! It looks like I only released the pre-trained model. I will include the training scripts and ping you when the code is up. Thanks!
Thanks for your response. Here are the four statements that I responded to:

Statement: If I have an early morning meeting, then wake me up early, because I want to be on time.

Statement: If there are thunderstorms in the forecast within a few hours, then remind me to close the windows, because I want to keep my home dry.

Statement: If I schedule an appointment that overlaps with another appointment, then notify me immediately, because I want to let my colleagues know of the conflict.

Statement: If I search for a gas station in the navigation app and there is a cheaper gas station that is not too much further away, then ask me immediately whether I want to switch the destination to the new gas station, because I want to save money.
Hello, and sorry for the slow response! In general (assuming there are no installation issues or missing files), whether a proof can be generated depends on whether the user interacting with the reasoning engine can give it the knowledge needed to complete the proof. Another factor is how complete the knowledge base is in the first place. The current knowledge base is intentionally sparse and has a lot of missing knowledge, because we would like to see whether it is feasible to rely on human interaction for commonsense reasoning. Let me know if this description is not clear.
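The interaction loop described above can be illustrated with a minimal sketch. This is not CORGI's actual implementation; the `prove`, `ask_user`, and the toy goal/rule names are all hypothetical, and CORGI's real reasoner is far richer. The sketch only shows the idea that a proof succeeds when the user supplies a fact the knowledge base is missing, and fails (yielding an empty accepted proof) otherwise:

```python
# Hypothetical sketch (NOT CORGI's code): backward chaining that falls back
# to asking the user when the knowledge base lacks the needed fact.
def prove(goal, kb, rules, ask_user, depth=0, max_depth=5):
    """Return a list of proved goals, or None if the proof fails."""
    if depth > max_depth:
        return None
    if goal in kb:                      # fact already in the knowledge base
        return [goal]
    for head, body in rules:            # try rules whose head matches the goal
        if head == goal:
            proof = [goal]
            for subgoal in body:
                subproof = prove(subgoal, kb, rules, ask_user, depth + 1, max_depth)
                if subproof is None:
                    break               # this rule cannot complete the proof
                proof += subproof
            else:
                return proof            # every subgoal was proved
    # Missing knowledge: ask the user, analogous to CORGI's dialog prompts.
    if ask_user(goal):
        kb.add(goal)                    # the user filled the gap
        return [goal]
    return None                         # proof fails -> accepted proof stays empty

# Toy example: the rule needs "wake_up_early", which only the user can supply.
kb = {"have_early_meeting"}
rules = [("be_on_time", ["wake_up_early"])]
proof = prove("be_on_time", kb, rules,
              ask_user=lambda g: g == "wake_up_early")
```

With a cooperative user the proof is non-empty; replacing `ask_user` with `lambda g: False` makes `prove` return `None`, mirroring the empty `acceptedProof` reported in this issue.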
Hello, I got the same results as @AdamIshay. As you suggested, answering "if i wake up at 8" to the first question does complete the proof. But I was wondering if I was doing something wrong when I got the following response:

Statement: If there are thunderstorms in the forecast within a few hours, then remind me to close the windows, because I want to keep my home dry.

0: sorry, i do not know how to do what you asked me :(

Here, in the two cases above, I am giving a statement that directly maps the goal to the query and is an example from the paper, and CORGI is able to "match the goal", so the proof should have been completed. But it is failing in both cases. I also frequently noticed that CORGI doesn't ask me a second query on the subgoal, but instead fails after it is unable to complete the proof using the first input. What could be the reason for that? Thank you so much!
Hello,
When I run the reasoning.py file, the acceptedProof is always empty. Is this expected? How do I see the generated proofs?
Additionally, I don't see the neural theorem prover learning in reasoning.py. Am I missing something, or is there a separate instruction for running the theorem prover learning?
Thank you!