I don't need no proxy

Published: 30 Dec 2019

For a long time, I have been looking at solving a simple problem: how to be more efficient when scaling vulnerability research/bug hunting.

The problem: I think it makes a lot of sense to decouple the browsing of a website from the actual fuzzing. Using a spider is not really viable in 2019^w2020, so you need a real person in front of a laptop. You can imagine that exercising all the functions of the website is done by one person (QA team, Mechanical Turk, …) and the fuzzing is then done automatically.

Until last week, my main idea revolved around the following: the person in charge of the browsing visits the website via a proxy and then sends all the requests to the fuzzer. This works but creates a delay between the browsing and the fuzzing. There are multiple ways to do this:

  • Set up a proxy and get the browsing team to use it, then you get the logs.
  • Use a browser, capture all the traffic, and save it as HTTP Archives (HAR) or just copy the curl commands (one could imagine a wrapper around curl: fuzz [CURL COMMAND], sketched below).
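
For the curl option, here is a minimal sketch of what such a wrapper could look like. The fuzz name comes from the post, but the log file path and the "record now, fuzz later" behaviour are assumptions for illustration, not an existing tool:

#!/usr/bin/env ruby
# Hypothetical "fuzz" wrapper: it logs the full curl invocation so a fuzzer
# can replay it later, then runs the original command so the browsing
# workflow is not interrupted. The log path is made up for the example.
require 'shellwords'

LOG = File.expand_path("~/.fuzz_queue")

abort("usage: fuzz [CURL COMMAND]") if ARGV.empty?

File.open(LOG, "a") { |f| f.puts ARGV.shelljoin }

system(*ARGV)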

Those options are good, but there is another way I figured out last week (someone most likely already thought of it but I couldn’t find anything on it): use Chrome Debugging (--remote-debugging-port=9222). You run Chrome in debugging mode and get access to all the traffic in real time...

The code below illustrates a basic POC that looks for JWTs:

require 'chrome_remote'

# Two dot-separated segments starting with "eyJ" (the Base64 encoding of a
# JSON object) is a good-enough heuristic for a JWT
JWT = /eyJ.*\.eyJ.*\./

chrome = ChromeRemote.client

# Enable events
chrome.send_cmd "Network.enable"
chrome.send_cmd "Page.enable"

# Setup handler to log network requests containing a JWT
chrome.on "Network.requestWillBeSent" do |params|
  request = params["request"]
  if params["documentURL"] =~ JWT
    puts "documentURL"
    puts params.inspect
  elsif request["postData"] =~ JWT
    puts "postData"
    puts params.inspect
  else
    request["headers"].select { |k, v| v =~ JWT }.each do |k, v|
      puts k, v
      puts params.inspect
    end
  end
  # Cookies are not part of the request headers, they have to be fetched separately
  cookies = chrome.send_cmd("Network.getCookies", urls: [params["documentURL"]])["cookies"]
  cookies.select { |cookie| cookie["value"] =~ JWT }.each do |cookie|
    puts cookie
    puts params.inspect
  end
end

chrome.listen

Once you get the data, you can just send it via a queue to your fuzzer(s) and start attacking the application in real time.
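
As a hedged sketch of that plumbing, assuming a Redis list is used as the queue (the list name "requests" and the use of the redis gem are my choices for the example, not part of the original setup):

require 'chrome_remote'
require 'redis'
require 'json'

# Hypothetical glue: every captured request is serialised and pushed onto a
# Redis list ("requests") that one or more fuzzers consume.
redis  = Redis.new
chrome = ChromeRemote.client

chrome.send_cmd "Network.enable"

chrome.on "Network.requestWillBeSent" do |params|
  request = params["request"]
  redis.rpush "requests", JSON.generate(
    "url"      => request["url"],
    "method"   => request["method"],
    "headers"  => request["headers"],
    "postData" => request["postData"]
  )
end

chrome.listen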

One of the issues you run into (easy to solve) is that the Network API doesn’t give you access to the cookies as part of the headers. But you can still get them via Network.getCookies and add them to the request before pushing it to the queue. Altogether, I think this is likely going to be a nice tool in a bug hunter’s arsenal.
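
For completeness, here is a rough sketch of that cookie-merging step, reusing the same chrome_remote client; the helper name and the Cookie header formatting are assumptions:

require 'chrome_remote'

# Hypothetical helper: given the parameters of a Network.requestWillBeSent
# event, fetch the cookies for the page via Network.getCookies and fold them
# into the request headers before the request is pushed to the queue.
def with_cookies(chrome, params)
  request = params["request"]
  cookies = chrome.send_cmd("Network.getCookies",
                            urls: [params["documentURL"]])["cookies"]
  unless cookies.empty?
    request["headers"]["Cookie"] = cookies.map { |c| "#{c['name']}=#{c['value']}" }.join("; ")
  end
  request
end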

Written by Louis Nyffenegger
Founder and CEO @PentesterLab