
Smart way to Capture Jobs and Process Meta Data Using DynamoDB | Project Demo | Python Templates | Alerts

Overview:

In this article, I will present a solution that lets you easily monitor and capture the status of running jobs and tasks. Capturing these details allows us to determine how long a process takes, what its status is, and, when necessary, drill down into task-level details. When a job runs, it generates a unique process ID (a GUID) that represents the running or ongoing work. The process has a start time, an end time, and a status for the ongoing activity. Each task in the process has a name, a start and end time, and a status. If any task fails, the process status is marked as Failed. If users need more visibility into a function, they can simply wrap it with the decorator, and all details for that task will be captured in DynamoDB. I will demonstrate how to design and implement this solution.
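As a rough illustration of the metadata described above, a process record in DynamoDB might look like the following. The attribute names here are assumptions for illustration, not taken from the repository:

```python
import uuid
from datetime import datetime, timezone

# Hypothetical shape of a process item; attribute names are illustrative only.
process_item = {
    "PK": f"PROCESS#{uuid.uuid4()}",  # unique process GUID
    "SK": "METADATA",
    "status": "Progress",             # Progress | Success | Failed
    "start_time": datetime.now(timezone.utc).isoformat(),
    "end_time": None,                 # filled in when the process completes
}
```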

Architecture

(Architecture diagram: capture.drawio)

Frontend


  • Shows all processes that ran for a given day
  • Shows all tasks for a given process on a given day (queried via a GSI view)

Alerts


  • Sends alerts for failed processes using Kinesis Streams, Lambda, and SNS

To receive alerts, add your email address in serverless.yml; this address will receive notifications for failed processes:

  MySubscription:
      Type: AWS::SNS::Subscription
      Properties:
        Endpoint: your_email.com
        Protocol: email
        TopicArn: !Ref 'SNSTopic'

Install and Deploy Stack

command 1: npm install -g serverless

command 2: serverless config credentials --provider aws --key XXXX  --secret XXXXX -o

command 3: serverless deploy

How to Use

class Jobs(object):
    def __init__(self):
        self.process_instance = Process()
        self.__create_process()

    def __create_process(self):
        self.process_instance.create()
        self.process_instance.progress()

    def run(self):
        response_1 = self.step_1()
        response_2 = self.step_2()

        self.process_instance.success()

    @dynamodb_task()
    def step_1(self):
        print("some business rules and code go here")
        print("some more business rules and code go here")

    @dynamodb_task()
    def step_2(self):
        # Raising here simulates a task failure; the decorator captures it
        # and the process is marked as Failed.
        raise Exception("OUCH")
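The `Process` helper used above is not shown in this excerpt. A minimal in-memory sketch of what it might look like follows; this is an assumption about its interface based on the calls above (`create`, `progress`, `success`), and the real class presumably persists these fields to DynamoDB rather than keeping them on the instance:

```python
import time
import uuid


class Process:
    """Hypothetical in-memory sketch of the Process helper.

    The real implementation would write these fields to a DynamoDB
    table instead of storing them on the object.
    """

    def __init__(self):
        self.process_id = str(uuid.uuid4())  # unique GUID for this run
        self.status = None
        self.start_time = None
        self.end_time = None

    def create(self):
        # Record when the process started.
        self.start_time = time.time()

    def progress(self):
        self.status = "Progress"

    def success(self):
        self.status = "Success"
        self.end_time = time.time()

    def failed(self):
        self.status = "Failed"
        self.end_time = time.time()
```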

Explanations

  • @dynamodb_task()
  • Whenever you decorate a method with @dynamodb_task(), the decorator logs the task's metadata in DynamoDB: the task's start time, end time, and status.
  • Status can be Success | Progress | Failed.
  • Since the method raised an exception, the process is marked Failed.
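As a rough sketch of how such a decorator could work, the following is a hypothetical reconstruction, not the repository's implementation; it records into an in-memory dict where the real version would write an item to DynamoDB:

```python
import functools
import time
import uuid

# In-memory stand-in for the DynamoDB table (assumption: the real
# implementation writes these items with boto3 instead).
TASK_LOG = {}


def dynamodb_task():
    """Hypothetical sketch of a task-logging decorator."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            task_id = str(uuid.uuid4())
            item = {
                "task_name": func.__name__,
                "start_time": time.time(),
                "status": "Progress",
            }
            TASK_LOG[task_id] = item
            try:
                result = func(*args, **kwargs)
                item["status"] = "Success"
                return result
            except Exception:
                # Mark the task Failed and re-raise so the caller can
                # mark the whole process Failed as well.
                item["status"] = "Failed"
                raise
            finally:
                item["end_time"] = time.time()
        return wrapper
    return decorator
```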


The exception we generated in the code can be seen in the captured task record.


Tasks


GSI


  • GSI1 gives you all tasks for a given process.

  • GSI2 gives you all processes for a given day; you can use the SK to filter by status or other attributes if needed.

  • GSI3 gives you all processes for a given month.

  • A TTL field deletes records as the processes get older.
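One common way to support these three access patterns in a single table is to derive the GSI partition/sort keys from the process attributes. The key layout below is a hedged sketch of how that could be done; the actual key names and formats used in this repository may differ:

```python
import datetime


def process_keys(process_id: str, status: str, run_date: datetime.date) -> dict:
    """Hypothetical key layout supporting the three GSI access patterns."""
    return {
        "PK": f"PROCESS#{process_id}",
        "SK": f"STATUS#{status}",
        # GSI1: all tasks for a given process share this partition key.
        "GSI1PK": f"PROCESS#{process_id}",
        # GSI2: all processes for a given day; SK allows filtering by status.
        "GSI2PK": f"DATE#{run_date.isoformat()}",
        "GSI2SK": f"STATUS#{status}",
        # GSI3: all processes for a given month.
        "GSI3PK": f"MONTH#{run_date.strftime('%Y-%m')}",
    }
```

With keys shaped this way, "all failed processes on 2023-01-15" becomes a single Query on GSI2 with `GSI2PK = DATE#2023-01-15` and a `begins_with` condition on `GSI2SK`.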


Soumil Nitin Shah

Bachelor in Electronic Engineering | Masters in Electrical Engineering | Master in Computer Engineering |
