In a previous post I experimented with serverless monitoring of my websites. I wondered whether I could extend the monitoring function to gather rudimentary data on how long the site takes to load.
I decided to modify the Lambda function I used earlier to measure the time it takes for the function to read the response back from the server. This is certainly not the best way to monitor page load times, and it is no substitute for proper synthetic browser monitoring, but it gives me a bird’s-eye view of the trend and can alert me if load times change unexpectedly.
The Python code I created is below:
(Disclaimer: I am sure this can be optimised quite a bit, but it illustrates the general idea.)
import os
import socket
from time import time

import boto3
import urllib2


def write_metric(value, metric):
    # Publish the measured load time as a custom CloudWatch metric.
    d = boto3.client('cloudwatch')
    d.put_metric_data(
        Namespace='Web Status',
        MetricData=[
            {
                'MetricName': metric,
                'Dimensions': [
                    {
                        'Name': 'Status',
                        'Value': 'Page Load Time',
                    },
                ],
                'Value': value,
            },
        ]
    )


def check_site(url):
    # Quick TCP reachability check before timing the full page fetch.
    try:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.connect((url, 443))
    except socket.error:
        print("[Error:] Cannot connect to site %s" % url)
        return 0.005
    else:
        s.close()
        print("Checking %s Page Load time" % url)
        start_time = time()
        request = urllib2.Request("http://" + url)
        response = urllib2.urlopen(request)
        output = response.read()  # read the full body before stopping the clock
        end_time = time()
        response.close()
        load_time = round(end_time - start_time, 3)
        return load_time


def lambda_handler(event, context):
    websiteurl = str(os.environ.get('websiteurl'))
    metricname = websiteurl + ' Page Load'
    r = check_site(websiteurl)
    print(websiteurl + " loaded in %r seconds" % r)
    write_metric(r, metricname)
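The listing above targets the Python 2 runtime (urllib2). Since newer Lambda runtimes are Python 3 only, a rough Python 3 equivalent of check_site, keeping the same fallback value on connection failure, might look like this:

```python
import socket
from time import time
from urllib import request

def check_site(url):
    # Quick reachability check before timing the full page fetch.
    try:
        s = socket.create_connection((url, 443), timeout=10)
        s.close()
    except OSError:
        print("[Error:] Cannot connect to site %s" % url)
        return 0.005  # same sentinel value as the original
    start_time = time()
    with request.urlopen("http://" + url) as response:
        response.read()  # pull the whole body, like the original
    return round(time() - start_time, 3)
```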