Python
Proxy Web Requests
This code lets you make web requests to a single web server through a pre-made list of proxies in round-robin fashion. The proxies are read from an Excel spreadsheet, so you can use as many proxies as you like (or can find).
I wrote the code in Python and it uses openpyxl to read in the proxy list. The actual requests are sent through the command line using cURL, and I wrote and tested this on Raspberry Pi OS. Finding proxies is simple: just google 'Free Proxy List' and you'll find plenty.
To begin, set up the spreadsheet in the following way:
List out the proxies in the first column in this format: ProxyIP:Port, with no spaces before or after. The first thing the code does is check row by row for the first empty cell in column 'A' and take that as the stopping point. Make sure the spreadsheet is placed in the same directory as the .py file too.
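To illustrate the stopping-point logic, here's a minimal sketch that uses a plain Python list to stand in for column 'A' (with None marking the first blank cell); the real script does the same row-by-row scan with openpyxl cells:

```python
def count_proxies(column_a):
    """Count entries until the first blank (None) cell - the stopping point."""
    count = 0
    for value in column_a:
        if value is None:
            break
        count += 1
    return count

# A hypothetical list standing in for column 'A' of the spreadsheet
sample = ["203.0.113.5:8080", "198.51.100.7:3128", None, "ignored after blank"]
print(count_proxies(sample))  # → 2
```

Anything below the first blank cell is ignored, which is why the list must be contiguous from row 1.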
The code is looking for a file named 'ProxyList.xlsx', so name your proxy list exactly that. Remember that in Linux, file paths and names are case sensitive; 'ProxyList.xlsx' and 'proxylist.xlsx' are seen as two completely different files. Alternatively, you can change the file name in the Python code.
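Because of that case sensitivity, a quick existence check before loading the workbook gives a friendlier error than openpyxl's traceback. A minimal sketch (the default filename matches the one the script expects):

```python
from pathlib import Path

def check_proxy_file(name="ProxyList.xlsx"):
    """Return True if a file with exactly this name exists in the current directory."""
    if not Path(name).is_file():
        print(f"Could not find '{name}' - check the spelling and case of the filename.")
        return False
    return True
```

You could call this at the top of the script and exit early when it returns False.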
Here's the accompanying Python; I'm using this website as the target web server:
import os
import time
from sys import exit
from openpyxl import load_workbook

cmdstart = 'sudo curl -x '
cmdend = ' https://dingbotcode.com/ -m 30'
lastrow = 1048576  # maximum number of rows in an .xlsx sheet
filename = "ProxyList.xlsx"

workbook = load_workbook(filename)
sheet = workbook.active

def sendreq():
    if sheet.cell(row=1, column=1).value is None:
        print('No proxies found')
        exit()
    # Scan down column 'A' for the first empty cell to find the stopping point
    for r in range(1, lastrow):
        if sheet.cell(row=r, column=1).value is None:
            qtyofproxies = r - 1
            print('Last non-blank row is ' + str(qtyofproxies))
            break
    # Send the request through each proxy in turn (+1 so the last row isn't skipped)
    for i in range(1, qtyofproxies + 1):
        cmd = cmdstart + sheet.cell(row=i, column=1).value + cmdend
        print('\n***(' + str(i) + ')***\n' + cmd)
        os.system(cmd)
        time.sleep(3)

if __name__ == "__main__":
    sendreq()
It runs through the proxy list from the top row to the bottom and sends the request through each proxy server in turn. I've set each request to time out after 30 seconds, as proxy servers can be unreliable and may go up or down at any moment.
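If you'd rather not shell out through os.system, Python's subprocess module can run the same cURL command and enforce a timeout on the Python side as well. This is just a sketch of an alternative, not what the script above does; the command list is built from the same curl flags:

```python
import subprocess

def run_with_timeout(cmd, timeout=30):
    """Run a command list, returning its exit code, or None if it timed out."""
    try:
        result = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
        return result.returncode
    except subprocess.TimeoutExpired:
        return None  # the proxy never responded in time
```

Usage would look like run_with_timeout(["curl", "-x", proxy, "https://dingbotcode.com/", "-m", "30"]), and a None return tells you the proxy stalled rather than failed outright.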