
Django grouping logs belonging to one request

Python has a very robust and easy-to-use logging framework, and Django, being a Python-based web framework, uses that same logging module. To see how to implement logging in Django applications, please follow Django's official documentation. If you want me to explain how to implement a logger in Django, feel free to leave a message in the comments section and I will write a blog post on it.

Assuming you already have basic knowledge of implementing a logger in Django, I will move forward with the tutorial. One problem I faced with Django's logger is tracing the logs of a single request through the different apps of a Django project.

[Image: the Django request response cycle]


A typical request response cycle in Django looks like the image above. When a request hits the middleware you may want to log something; after that, flow moves to the application, where the code may pass through many modules. Now assume that, for a particular request, we want to check what happened by reading the logs. This can be very difficult, because we have no unique identifier for a request in our logs.

One obvious solution is to add an extra unique identifier to Django's logs; let's see how to do it. From the image above we see that every request first goes through the middleware, then to our Django app, and then back through the middleware. Also, each request response cycle is executed in a single thread. We can use this to our advantage: whenever a request reaches our middleware, we generate a unique identifier for it and assign it to the thread's name, and in our logger format we print the thread name. For that particular request, the thread name will be our unique identifier, and we can easily trace the request with it.
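Before the Django-specific files below, here is a minimal, framework-free sketch of the idea: rename the current thread and let the formatter's %(threadName)s field stamp every log line. The logger name and in-memory stream here are illustrative, chosen only so the snippet is self-contained.

```python
import io
import logging
import threading
import uuid

# Capture log output in memory instead of a file, just for this demo.
log_stream = io.StringIO()
handler = logging.StreamHandler(log_stream)
handler.setFormatter(logging.Formatter("[%(threadName)s] %(message)s"))

logger = logging.getLogger("demo")
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

# Tag the current thread with a fresh UUID, as the middleware will do.
request_id = str(uuid.uuid4())
threading.current_thread().name = request_id

logger.debug("Request log")
logger.debug("Application log")

print(log_stream.getvalue())
```

Both lines in the printed output carry the same UUID, even though nothing was passed between the two logging calls; the thread itself carries the identifier.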

# file:: test_logger.py
import logging

def getLogger():
    logger = logging.getLogger('test')
    # %(threadName)s will print the per-request UUID set by the middleware.
    formatter = logging.Formatter("%(asctime)-15s [%(threadName)s] %(filename)s %(message)s")
    handler = logging.FileHandler('log.txt', mode='a')
    handler.setFormatter(formatter)
    logger.setLevel(logging.DEBUG)
    # Guard against adding a duplicate handler each time getLogger() is called.
    if not logger.handlers:
        logger.addHandler(handler)
    return logger


# file:: middleware.py
from threading import current_thread
import uuid  # we will be using this to generate the unique identifier

from test_logger import getLogger

class LoggerMiddleware(object):
    logger = getLogger()

    def process_request(self, request):
        # Name the thread after a fresh UUID; every log line emitted on
        # this thread now carries the request's identifier.
        current_thread().name = str(uuid.uuid4())
        self.logger.debug('Request log')

    def process_response(self, request, response):
        self.logger.debug('Response log')
        # Django requires process_response to return the response.
        return response


# file:: views.py
from django.http import HttpResponse
from django.views.generic import View

from test_logger import getLogger

logger = getLogger()

class TestView(View):
    def post(self, request, *args, **kwargs):
        logger.debug('Application log')
        return HttpResponse('ok')

# file:: log.txt
2017-09-01 00:47:27,793 [61b20f5c-c0ed-4071-ab7c-a43e4c5f8139] middleware.py Request log
2017-09-01 00:47:27,793 [61b20f5c-c0ed-4071-ab7c-a43e4c5f8139] views.py Application log
2017-09-01 00:47:27,796 [61b20f5c-c0ed-4071-ab7c-a43e4c5f8139] middleware.py Response log


The above log file has one value, [61b20f5c-c0ed-4071-ab7c-a43e4c5f8139], common to lines from all three files. We can use it to trace our request.
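Once every line carries the identifier, tracing a request is a plain text filter over the log file. A quick sketch in Python (the sample lines below are illustrative, mimicking the log.txt output above):

```python
# Keep only the lines belonging to one request, identified by its UUID.
request_id = "61b20f5c-c0ed-4071-ab7c-a43e4c5f8139"

log_lines = [
    "2017-09-01 00:47:27,793 [61b20f5c-c0ed-4071-ab7c-a43e4c5f8139] middleware.py Request log",
    "2017-09-01 00:47:27,794 [0f3a9d2e-1b47-4c8a-9e55-7d2c6b1a0f33] views.py Application log",
    "2017-09-01 00:47:27,796 [61b20f5c-c0ed-4071-ab7c-a43e4c5f8139] middleware.py Response log",
]

request_trace = [line for line in log_lines if request_id in line]
for line in request_trace:
    print(line)
```

In practice a simple `grep` with the UUID against log.txt does the same job.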
