Retrieval practice taken before studying material can enhance memory retention more than restudying, a phenomenon known as the pretesting effect. Its underlying mechanism, however, remains debated; errors are thought to play an important role in encoding during the pretest. The current study examines this mechanism by simulating the results of Kornell and colleagues (2009) with a biologically plausible model based on error-driven learning. Motivated by the error-correction hypothesis, which predicts that the size of the error made on the pretest is proportional to subsequent learning, the study further addresses the relationship between error magnitude and memory retention. Finally, the thesis refines the pretesting procedure into a more realistic condition in the model. The results suggest that error-driven learning may be the mechanism behind the pretesting effect, and that error magnitude positively predicts memory improvement. The refined model, which adds a further study module, predicts the same effect, lending additional support to an error-driven-learning account of pretesting. Limitations and future directions are discussed.
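As a rough illustration of the mechanism the abstract refers to, the sketch below implements a plain delta rule in NumPy. It is a hypothetical, simplified stand-in for the Leabra model in this repository, and every name and parameter in it (`n_units`, `lr`, `pretest_trial`) is made up for illustration: the weight update is proportional to the difference between the network's retrieval attempt and the correct target, so a larger pretest error produces a larger corrective change, which is the core prediction of the error-correction hypothesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and learning rate (not taken from the thesis model).
n_units = 8
lr = 0.1

# A random cue/target pair stands in for a weakly associated word pair.
cue = rng.random(n_units)
target = rng.random(n_units)
W = rng.normal(0.0, 0.1, size=(n_units, n_units))  # associative weights

def pretest_trial(W, cue, target, lr):
    """One error-driven (delta-rule) step: dW = lr * outer(target - guess, cue)."""
    guess = W @ cue                      # retrieval attempt before feedback
    error = target - guess               # pretest error signal
    W = W + lr * np.outer(error, cue)    # update scales with error magnitude
    return W, float(np.linalg.norm(error))

W, err_first = pretest_trial(W, cue, target, lr)
_, err_second = pretest_trial(W, cue, target, lr)
print(f"pretest error: {err_first:.3f} -> after one update: {err_second:.3f}")
```

Leabra's actual learning rule (XCAL) is more elaborate, combining error-driven and Hebbian terms, but it shares the property illustrated here: learning is driven by the mismatch between expectation and outcome, so an unsuccessful retrieval attempt still contributes to encoding, and a larger error yields a larger update.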