Support read/write file from Amazon Web Service S3 Bucket using the _FileDataNodeMixin #2350

Open
2 of 7 tasks
trgiangdo opened this issue Dec 18, 2024 · 0 comments
Labels
✨New feature 🔒 Staff only Can only be assigned to the Taipy R&D team

Comments


trgiangdo commented Dec 18, 2024

Description

File-based data nodes should support reading and writing a file stored in an AWS S3 bucket when the configured path points to an S3 location.

Take the PickleDataNode as an example:

from taipy import Config, Scope

Config.configure_pickle_data_node(
    id="aws_pickle_dn",
    path="https://my-example-bucket.s3.us-east-1.amazonaws.com/folder1/file.p",
    scope=Scope.GLOBAL,
    aws_access_key="...",
    aws_secret_access_key="...",
    # ... other configuration attributes
)

The same applies to the other file-based data nodes, as in the example below.

Solution Proposed

Use the boto3 library.

The implementation should be similar to that of the S3ObjectDataNode.

Acceptance Criteria

  • If applicable, a new demo code is provided to show the new feature in action.
  • Integration tests exhibiting how the functionality works are added.
  • Any new code is covered by unit tests.
  • Code coverage is at least 90%.
  • Related issue(s) in taipy-doc are created for documentation and Release Notes are updated.

Code of Conduct

  • I have checked the existing issues.
  • I am willing to work on this issue (optional)