I have a Gooey application that acts as a front end for my deep learning script. I currently have 3 subparsers, "create-training-data", "train", and "predict".
parser = GooeyParser()
subparser = parser.add_subparsers(dest="command")

create_data_parser: ArgumentParser = subparser.add_parser("create-training-data")
create_data_parser.add_argument("--samples", type=int, help="The number of samples to download.", default=DEFAULT_ARGS_1.samples)
create_data_parser.add_argument("--training-data-output-dir", type=str, widget="DirChooser", help="Location to save .npy labeled training data", default=DEFAULT_ARGS_1.training_data_output_dir)
...  # many more options

train_parser: ArgumentParser = subparser.add_parser("train")
train_parser.add_argument("--training-data-file", type=str, widget="FileChooser", help="The .npy file used to train the network.", default=DEFAULT_ARGS_2.training_data_file)
...  # many more options
The current workflow involves running an action (subparser), saving its output to disk, and then loading that output as one of the options for the next action. This works well, but loading these items from disk takes considerable time due to their size.
Gooey gives us a lovely "Edit" button at the end of an action, which takes us back to the first screen so we can run a different action or change the options.
I would like to add an option to cache the output of one task so it can be used as the input of another within the same session, instead of round-tripping through disk, since loading neural network models from disk takes a long time.
Something like:
# `save_training_data` is used when the "create-training-data" action is run,
# and `load_training_data` is used when the "train" action is run.
CACHE = {}

def save_training_data(data, path):
    global CACHE
    data.save(path)  # to disk
    CACHE[path] = data

def load_training_data(path):
    global CACHE
    if path in CACHE:
        return CACHE[path]
    else:
        data = np.load(path)  # from disk
        CACHE[path] = data
        return data
The problem is, Gooey reloads the module upon restarting the script so the CACHE is deleted. Is there a way around this? I do not mind using a hack as a solution, even if it involves patching my local version of Gooey.