Issue:
When people fix an error a bot has made, the bot doesn't learn from the correction and will sometimes overwrite it, because nothing in its code tells it to improve.
Bots keep making the same mistakes, which then have to be corrected manually every time.
Solution:
If a bot goes in and helps other bots improve their sources, they can be corrected so they don't cause further mistakes.
There could be a bot that patrols the Open Library website to check for bots duplicating other works, making mistakes, etc.
- it would do this by double-checking each disputed value with an internet search and a full-text search, then analysing which of the candidate values is correct.
- bots would have a self-improvement feature built in.
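The "analysis of which choice is correct among many" step could be as simple as a majority vote across sources. Below is a minimal sketch of that idea; `resolve_field` and the candidate-list format are hypothetical, not part of the Open Library codebase, and a real patrol bot would also weight sources by reliability.

```python
from collections import Counter

def resolve_field(candidates):
    """Pick the value that a strict majority of candidate sources
    agree on (e.g. an internet search result, a full-text search
    result, and the current record). Hypothetical helper, not an
    actual Open Library API.

    Returns None when no majority exists, so the conflict can be
    flagged for human review instead of silently overwritten.
    """
    if not candidates:
        return None
    value, count = Counter(candidates).most_common(1)[0]
    # Require more than half the sources to agree before acting.
    if count > len(candidates) / 2:
        return value
    return None
```

For example, `resolve_field(["1984", "1984", "1948"])` returns `"1984"`, while a two-way tie returns `None` and would be left for a human to resolve.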
Constraints
If someone writes something incorrect after a bot's edit, and the self-correcting feature uses that incorrect information to replace its own, the original information can be lost or replaced by bad data.
To fix this would require:
- continuously booting off users who fill in incorrect info
- keeping a copy of the original information in a database while the bot works off a corrected one.
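The second constraint above can be sketched as a store that snapshots each record on first import and applies corrections only to a working copy. `RecordStore` and its method names are assumptions for illustration, not Open Library's actual schema:

```python
import copy

class RecordStore:
    """Minimal sketch: keep an immutable snapshot of the originally
    imported record alongside the bot's working copy, so a bad
    correction can always be rolled back. Hypothetical design, not
    the real Open Library database layout."""

    def __init__(self):
        self._originals = {}  # record id -> first imported version
        self._working = {}    # record id -> current corrected version

    def import_record(self, rec_id, data):
        # Snapshot the original only once; re-imports never overwrite it.
        self._originals.setdefault(rec_id, copy.deepcopy(data))
        self._working[rec_id] = copy.deepcopy(data)

    def correct(self, rec_id, field, value):
        # Corrections (from bots or users) touch only the working copy.
        self._working[rec_id][field] = value

    def original(self, rec_id):
        return self._originals[rec_id]

    def current(self, rec_id):
        return self._working[rec_id]
```

This way, even if a self-correcting bot ingests bad user input, the originally imported data survives untouched and can be restored.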