Serverless has recently become the thing that gets everyone excited. So when we were looking at rebuilding our build dashboard, serverless was the way to go.
The architecture has five basic components.
- Build/Deploy Pipeline
Here I’m using AWS CodePipeline, but you can choose any CI/CD system. Using AWS CodePipeline makes it easier to use the out-of-the-box metrics. With any other CI/CD system, you would need to push metrics into Amazon CloudWatch at the level of granularity you need: pipeline level, stage/step level or job level.
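For a CI/CD system outside AWS, the push could look like the sketch below; the namespace, metric name and dimension are illustrative assumptions, not something this setup prescribes.

# Hypothetical sketch: push a pipeline-level metric into CloudWatch from a
# non-AWS CI/CD system. Namespace, metric name and dimension are assumed.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_data(
    Namespace="CustomCICD",  # assumed namespace for your CI/CD system
    MetricData=[
        {
            "MetricName": "PipelineExecutionState",
            "Dimensions": [{"Name": "PipelineName", "Value": "myPipeline"}],
            "Value": 1,  # e.g. 1 for SUCCEEDED, 0 for FAILED
            "Unit": "Count",
        }
    ],
)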
- CloudWatch Events Rule
Pick the event that marks a change in state for the CI/CD pipeline, or whatever signal you would like to track on your dashboard, and create a CloudWatch Events rule that matches it.
Here is a sample CodePipeline action-level event:
{ "version": "0", "id": 01234567-EXAMPLE, "detail-type": "CodePipeline Action Execution State Change", "source": "aws.codepipeline", "account": 123456789012, "time": "2020-01-24T22:03:07Z", "region": "us-east-1", "resources": [ "arn:aws:codepipeline:us-east-1:123456789012:myPipeline" ], "detail": { "pipeline": "myPipeline", "execution-id": 12345678-1234-5678-abcd-12345678abcd, "stage": "Prod", "action": "myAction", "state": "STARTED", "type": { "owner": "AWS", "category": "Deploy", "provider": "CodeDeploy", "version": 1 }, "input-artifacts": [ { "name": "SourceArtifact", "s3location": { "bucket": "codepipeline-us-east-1-BUCKETEXAMPLE", "key": "myPipeline/SourceArti/KEYEXAMPLE" } } ] } }
You can use the CLI to create or update the event rule in CloudWatch:
aws events put-rule --name "TriggerPipelinesDashboard" \
    --event-pattern "{\"source\":[\"aws.codepipeline\"],\"detail-type\":[\"CodePipeline Action Execution State Change\"],\"detail\":{\"state\":[\"STARTED\",\"SUCCEEDED\",\"FAILED\"]}}" \
    --role-arn "arn:aws:iam::123456789012:role/pipelinesDashboardTriggerRole"
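Note that put-rule only creates the rule; the Lambda function described next still has to be attached to it as a target. A minimal sketch using boto3, with the function ARN assumed to match the names used in this post:

# Attach the Lambda function (created in the next step) as a target of the
# rule. The function ARN below is an assumption based on this post's names.
import boto3

events = boto3.client("events", region_name="us-east-1")

events.put_targets(
    Rule="TriggerPipelinesDashboard",
    Targets=[
        {
            "Id": "pipelines-dashboard-lambda",
            "Arn": "arn:aws:lambda:us-east-1:123456789012:function:pipelines-dashboard",
        }
    ],
)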
- Lambda Function
The Lambda function pushes data into a DynamoDB table. DYNAMODB_TABLE_NAME is an environment variable that can be configured while deploying the Lambda function.

Function name: pipelines-dashboard
import logging
import os

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def lambda_handler(event, context):
    """Entry point for the Lambda function."""
    # The event uses a lowercase "region" key, as in the sample event above
    region = event.get("region", "us-east-1")
    event_details = event.get('detail')
    pipeline_name = event_details['pipeline']
    pipeline_latest_execution_status = event_details['state']
    pipeline_stage = event_details['stage']
    # The data structure keys need to match the DynamoDB key
    pipeline_data = {
        'PipelineName': pipeline_name,
        'PipelineStatus': pipeline_latest_execution_status,
        'PipelineStage': pipeline_stage
    }
    logger.info(pipeline_data)
    put_item_in_dynamodb(pipeline_data, region)


def put_item_in_dynamodb(pipeline_data, region):
    """Write a single pipeline-state record into the DynamoDB table."""
    dynamodb = boto3.resource('dynamodb', region_name=region)
    dynamodb_table_name = os.environ['DYNAMODB_TABLE_NAME']
    table = dynamodb.Table(dynamodb_table_name)
    table.put_item(
        Item=pipeline_data
    )
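You can smoke-test the handler locally with a trimmed-down version of the sample event shown earlier, assuming the handler above is in scope, DYNAMODB_TABLE_NAME points at the table, and valid AWS credentials are available:

# Local smoke test for lambda_handler, using a trimmed-down version of the
# sample CodePipeline event shown earlier.
import os

os.environ["DYNAMODB_TABLE_NAME"] = "pipelines_dashboard_data"

sample_event = {
    "region": "us-east-1",
    "detail": {
        "pipeline": "myPipeline",
        "stage": "Prod",
        "state": "STARTED",
    },
}

lambda_handler(sample_event, None)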
While creating the Lambda function, it is important to note that unless you add the InvokeFunction permission for the events service, changes in the events will not be able to trigger the Lambda function. This is in addition to the permissions defined in the IAM role above, pipelinesDashboardTriggerRole.
aws lambda add-permission --function-name 'arn:aws:lambda:us-east-1:123456789012:function:pipelines-dashboard' \
    --statement-id 987654321 \
    --action 'lambda:InvokeFunction' \
    --principal 'events.amazonaws.com' \
    --source-arn 'arn:aws:events:us-east-1:123456789012:rule/TriggerPipelinesDashboard'
- DynamoDB Table
I consider DynamoDB the right choice for this setup since the data is non-relational and each record is a simple key-value document.
The CLI call below creates a DynamoDB table named pipelines_dashboard_data:
aws dynamodb create-table --table-name pipelines_dashboard_data \
    --attribute-definitions AttributeName=PipelineName,AttributeType=S \
    --key-schema AttributeName=PipelineName,KeyType=HASH \
    --provisioned-throughput ReadCapacityUnits=2,WriteCapacityUnits=2
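Once a few pipeline events have flowed through, a quick scan, sketched below, confirms that the Lambda function is writing items into the table:

# Quick sanity check: scan the table and print the stored pipeline records.
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("pipelines_dashboard_data")

for item in table.scan()["Items"]:
    print(item["PipelineName"], item["PipelineStatus"], item["PipelineStage"])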
- API Gateway
Configure an API Gateway as a proxy for the DynamoDB table. API Gateway can be a bit tricky to set up because of the number and sequence of steps involved. Personally, it was a massive trial, error and repeat process for me. The good part is that readers of this blog needn’t put in that e(rr)ffort.
Configuring the above setup through the CLI can get a bit overwhelming. AWS CloudFormation is really good at resolving dependencies between services and handling complex infrastructure setups. In fact, this whole serverless setup can be automated using CloudFormation.
The template below creates and configures an API Gateway as a DynamoDB proxy.
Description: Creates a CloudFormation stack to deploy a DynamoDB table and an API Gateway proxy
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  PipelinesDashboardApiGatewayExecutionRole:
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: 'PipelinesDashboardApiGatewayExecutionRole'
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Action: ['sts:AssumeRole']
            Effect: Allow
            Principal:
              Service:
                - apigateway.amazonaws.com
      Path: /
      Policies:
        - PolicyName: 'PipelinesDashboardApiGatewayExecutionRolePolicy'
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Action:
                  - 'dynamodb:Scan'
                Effect: Allow
                Resource: '*'
  PipelinesDashboardDynamoDbTable:
    Type: 'AWS::DynamoDB::Table'
    Properties:
      TableName: 'pipelines_dashboard_data'
      AttributeDefinitions:
        - AttributeName: 'PipelineName'
          AttributeType: 'S'
      KeySchema:
        - AttributeName: 'PipelineName'
          KeyType: 'HASH'
      ProvisionedThroughput:
        ReadCapacityUnits: '2'
        WriteCapacityUnits: '2'
  PipelinesDashboardStage:
    Type: 'AWS::ApiGateway::Stage'
    Properties:
      DeploymentId: !Ref 'ApiDeployment'
      Description: 'API Gateway for Pipelines Dashboard'
      MethodSettings:
        - ResourcePath: /
          HttpMethod: GET
      RestApiId: !Ref 'DashboardApi'
      StageName: 'data'
  ApiDeployment:
    Type: 'AWS::ApiGateway::Deployment'
    DependsOn: GetPipelinesDetailsMethod
    Properties:
      RestApiId: !Ref 'DashboardApi'
  GetPipelinesDetailsMethod:
    Type: 'AWS::ApiGateway::Method'
    Properties:
      AuthorizationType: 'NONE'
      HttpMethod: 'GET'
      Integration:
        Type: 'AWS'
        IntegrationHttpMethod: 'POST'
        Uri: !Sub 'arn:aws:apigateway:${AWS::Region}:dynamodb:action/Scan'
        PassthroughBehavior: 'WHEN_NO_TEMPLATES'
        Credentials: !Sub 'arn:aws:iam::${AWS::AccountId}:role/PipelinesDashboardApiGatewayExecutionRole'
        IntegrationResponses:
          - ResponseTemplates:
              application/json: |
                #set($inputRoot = $input.path('$'))
                [
                  #foreach($elem in $inputRoot.Items) {
                    "name": "$elem.PipelineName.S",
                    "status": "$elem.PipelineStatus.S",
                    "stage": "$elem.PipelineStage.S"
                  }#if($foreach.hasNext),#end
                  #end
                ]
            StatusCode: '200'
        RequestTemplates:
          application/json: |
            #set($inputRoot = $input.path('$'))
            { "TableName": "pipelines_dashboard_data" }
      OperationName: 'GetPipelinesData'
      ResourceId: !GetAtt 'DashboardApi.RootResourceId'
      RestApiId: !Ref 'DashboardApi'
      MethodResponses:
        - StatusCode: '200'
  DashboardApi:
    Type: 'AWS::ApiGateway::RestApi'
    Properties:
      Name: 'PipelinesDashboardApi'
      Description: 'API used for Pipelines Dashboard requests'
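Once the stack is deployed, the dashboard frontend only needs a plain GET against the stage URL. A minimal sketch, assuming a placeholder API ID (substitute the one CloudFormation generates for DashboardApi):

# Fetch the dashboard data from the deployed API Gateway stage. The API ID
# in the URL is a placeholder; use the one generated for DashboardApi.
import json
import urllib.request

API_URL = "https://abcdef1234.execute-api.us-east-1.amazonaws.com/data"

with urllib.request.urlopen(API_URL) as response:
    pipelines = json.loads(response.read())

for pipeline in pipelines:
    print(pipeline["name"], pipeline["status"], pipeline["stage"])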