Run any Python function on 1000 computers in 1 second.
Burla is the simplest way to scale Python. It has one function: remote_parallel_map.
It's open-source and works with GPUs, custom Docker containers, and up to 10,000 CPUs at once.

A data platform any team can learn in minutes:
Scale machine learning systems or other research efforts without weeks of onboarding or setup. Burla is open-source and can be deployed in your cloud with a single command.

How it works:
Burla only has one function:
from burla import remote_parallel_map

my_inputs = [1, 2, 3]

def my_function(my_input):
    print("I'm running on my own separate computer in the cloud!")
    return my_input

return_values = remote_parallel_map(my_function, my_inputs)
With Burla, running code in the cloud feels the same as coding locally:
Anything you print appears in your local terminal.
Exceptions thrown in your code are thrown on your local machine.
Your local python packages are automatically synchronized with the cluster.
Responses are fast: you can call a million simple functions in a couple of seconds.
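To illustrate the exception behavior above, here is a minimal local sketch. The remote_parallel_map call is shown commented out because it assumes a running Burla cluster; the local stand-in below it mimics the same driver-side experience of an error surfacing on your machine.

```python
def flaky(x):
    # Raises for one particular input, the way real workloads sometimes do.
    if x == 2:
        raise ValueError("something went wrong on input 2")
    return x * 10

# With Burla, the equivalent call would be (requires a running cluster):
#   from burla import remote_parallel_map
#   results = remote_parallel_map(flaky, [1, 2, 3])
# The ValueError raised on the remote worker would be re-raised here, locally.

# Local stand-in demonstrating the same behavior:
try:
    results = [flaky(x) for x in [1, 2, 3]]
except ValueError as e:
    print(f"caught locally: {e}")
```

Either way, your try/except sits in your local script; no remote log-digging is needed to see the failure.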
Attach big hardware to functions that need it:
Zero config files, just simple arguments like func_cpu and func_ram.
from xgboost import XGBClassifier

def train_model(hyper_parameters):
    # training_inputs and training_targets are assumed to be loaded elsewhere.
    model = XGBClassifier(n_jobs=64, **hyper_parameters)
    model.fit(training_inputs, training_targets)

remote_parallel_map(train_model, parameter_grid, func_cpu=64, func_ram=256)
Simple, flexible pipelines:
Nest remote_parallel_map calls to build simple, massively parallel pipelines.
Use background=True to schedule function calls that keep running after you close your laptop.
from burla import remote_parallel_map

def process_record(record):
    # Pretend this does some math per-record!
    return result

def process_file(file):
    results = remote_parallel_map(process_record, split_into_records(file))
    upload_results(results)

def process_files(files):
    remote_parallel_map(process_file, files, func_ram=16)

remote_parallel_map(process_files, [files], background=True)
Run code in any Docker image, using the latest GPUs:
Public or private: just paste a URI to your image and hit start. Burla works with any Linux-based Docker image.

Watch our Demo:
Get started now:
Questions? Schedule a call, or email [email protected]. We're always happy to talk.