
nbox.operator

Operators are how you write NBX-Jobs. If you are familiar with PyTorch, the usage is exactly the same; for everyone else, here's a quick recap:

from nbox import Operator

class MyOperator(Operator):
  def __init__(self, a: int, b: str):
    super().__init__()
    self.a: int = a
    self.b: Operator = MyOtherOperator(b) # nested calling
  
  def forward(self, x: int) -> int:
    y = self.a + x
    y = self.b(y) + x # nested calling
    return y

job = MyOperator(1, "hello") # define once
res = job(2)                 # use like python, screw DAGs

We always wanted to ensure there is the least developer resistance in the way of using Operator, so there is a convenient operator decorator that can wrap any function or class and expose all the powerful methods available on the Operator object, like .deploy(). By default, every wrapped function runs as a Job.

from nbox import operator

@operator()
def foo(i: float = 4):
  return i * i

# to deploy the operator 
if __name__ == "__main__":
  # pass deployment_type = "serving" to make an API
  foo_remote = foo.deploy('workspace-id')
  assert foo_remote() == foo()
  assert foo_remote(10) == foo(10)

You can also make simple stateful objects, like classes, using the @operator decorator, which serves them as an API endpoint.

@operator()
class Bar:
  def __init__(self, x: int = 1):
    self.x = x

  def inc(self):
    self.x += 1

  def getvalue(self):
    return self.x

  def __getattr__(self, k: str):
    # simple echo to demonstrate that underlying python object methods can
    # also be accessed over the internet
    return str(k)

if __name__ == "__main__":
  bar = Bar()                             # local instance
  bar_remote = Bar.deploy('workspace-id') # remote instance

  # increment the numbers
  bar.inc(); bar_remote.inc()

  # directly access the values, no need for futures
  assert bar.x == bar_remote.x

  print(bar.jj_guverner, bar_remote.jj_guverner)

If you want to use the APIs for already deployed jobs and servings, [nbox.Jobs](nbox.jobs.html) is the better documentation.

Engineering

Fundamentally, operators act as a wrapper on user code, sometimes abstracting functions away by breaking them into an __init__ and a forward. This is a simpler way to wrap a user function than letting users wrap their own, where it is easy to get false positives, so we explicitly split things into the two parts. These operators are spiritually like torch.nn.Modules as well: modules manage the underlying weights, while operators manage the underlying user logic.

Operators are a combination of several subsystems that are all added in the same class, though if we come up with a higher abstraction we will certainly refactor this:

  • tree: All operators are really treated like a tree, meaning the execution is nested and the order of execution is determined by the order of the operators in the tree. DAGs are fundamentally just trees with some nodes spun together so that they execute only once.
  • deploy, ...: All the services in NBX-Jobs.
  • get_nbx_flow: the static code analysis system that tries to understand the true user intent and, if possible (and with the user's permission), optimise the logic.

Tips

Operators are built to be the abstract equivalent of any computation, so code can easily be run in a distributed fashion.

  • Use Operator directly as a function as much as possible; it's the simplest way to use it.
  • Put the @operator decorator on your function and it will run as a job by default, which is what you want.
  • Put the @operator decorator on your class and it will run as a serving by default, which is what you want.

Documentation

Classes

class Operator

__init__

Create an operator, which abstracts your code into shareable building blocks that can then be deployed on either NBX-Jobs or NBX-Deploy.

Example:

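A minimal sketch (the Square class below is illustrative, not part of nbox):

from nbox import Operator

class Square(Operator):
  def __init__(self):
    super().__init__()

  def forward(self, x: int) -> int:
    return x * x

sq = Square()
assert sq(3) == 9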
__remote_init__

The user can overwrite this function; it will be called only when running on remote. This helps with things like creating models and caching them on self, instead of using lru_cache in forward.
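A hedged sketch of an override, assuming a hypothetical load_model helper:

from nbox import Operator

class ModelOperator(Operator):
  def __init__(self, model_path: str):
    super().__init__()
    self.model_path = model_path
    self.model = None

  def __remote_init__(self):
    # runs only on the remote machine: load the heavy model once and cache it on self
    self.model = load_model(self.model_path)  # load_model is a hypothetical helper

  def forward(self, x):
    return self.model(x)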

remote_init

Triggers __remote_init__ across the entire tree.

__repr__

from_job
(job_name: str, job_id: str)
Parameters
  • job_name - Name of the job. Defaults to "".
  • job_id - ID of the job. Defaults to "".

Latch onto an existing job so that it can be called as an operator. This is designed to be used as a part of the "Compute Fabric" mode.
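A hedged usage sketch, assuming from_job is called on the Operator class and the job already exists in your workspace (the name and id below are placeholders):

from nbox import Operator

remote_job = Operator.from_job(job_name = "my-job", job_id = "my-job-id")
remote_job(2)  # call it like a local operator; this triggers the existing job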

from_serving
(url: str, token: str)
Parameters
  • url - The URL of the serving.
  • token - The token to access the deployment; get it from settings.

Latch onto an existing serving operator.
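A hedged sketch, assuming from_serving is called on the Operator class; the URL and token below are placeholders taken from your deployment settings:

from nbox import Operator

remote_api = Operator.from_serving(
  url = "<your-serving-url>",  # placeholder
  token = "<your-token>",      # placeholder, from the settings page
)
out = remote_api(10)  # call it like a local operator; runs on the deployed serving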

fn

Wraps a function or class as an Operator, so you can use all the same methods as an Operator.

_cls

Do not use directly; use the @operator decorator instead. Utility to wrap a class as an operator.

from_class

Wraps an initialised class instance as an operator, so you can use all the same methods as an Operator.
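A hedged sketch, assuming from_class is exposed on the Operator class and takes an already constructed instance (Counter is an illustrative class, not part of nbox):

from nbox import Operator

class Counter:
  def __init__(self, x: int = 0):
    self.x = x

  def inc(self):
    self.x += 1

counter_op = Operator.from_class(Counter())  # assumption: from_class wraps the initialised instance
counter_op.inc()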

_fn

Do not use directly; use the @operator decorator instead. Utility to wrap a function as an operator.

from_fn

Wraps a function as an operator, so you can use all the same methods as an Operator.

__setattr__
__getattr__
__getitem__
__setitem__
__delitem__
__iter__
__next__
__len__
__contains__

propagate

Set kwargs for each child in the Operator.

thaw

Load JobProto into this Operator.

_named_operators

Returns an iterator over all operators in the tree, yielding both the name of the operator and the operator itself.
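A hedged sketch of walking the tree, assuming _named_operators takes no arguments (similar in spirit to torch.nn.Module.named_modules); MyOperator is the example class from the top of this page:

job = MyOperator(1, "hello")
for name, op in job._named_operators():
  print(name, type(op).__name__)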

__call__

forward

_get_dag

Get the DAG for this Operator including all the nested ones.

deploy
(group_id: str, deployment_type: str, resource: Resource, ignore_patterns: List[str])
Parameters
  • group_id - The Job/Deploy id to deploy to.
  • deployment_type - The deployment type. Defaults to None.
  • resource - The resource to use for deployment. Defaults to None.
  • ignore_patterns - The patterns to ignore. Defaults to [].

Uploads the relevant files to the cloud and deploys them as a batch process or an API endpoint, returning the corresponding .from_job() or .from_serving() Operator. The entire folder containing the caller file is uploaded, so a .nboxignore and a requirements.txt in that folder will be honoured.
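A hedged sketch of a deploy call using the parameters documented above (the group id is a placeholder, and the resource argument is omitted to fall back to the defaults):

from nbox import operator

@operator()
def heavy_task(n: int = 10):
  return sum(i * i for i in range(n))

if __name__ == "__main__":
  heavy_remote = heavy_task.deploy(
    "workspace-or-group-id",         # placeholder
    deployment_type = "serving",     # omit to keep the default batch-job behaviour
    ignore_patterns = ["*.ipynb"],   # skip notebooks when uploading the folder
  )
  assert heavy_remote(4) == heavy_task(4)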

map

Take the same logic and apply it to a list of inputs. This is different from star_map, which takes different logic and applies it to different inputs. Results are returned in the same order as the inputs.

In the current version this runs as many workers as there are inputs.
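A hedged sketch, assuming map accepts an iterable of inputs (the exact signature is not documented here):

from nbox import operator

@operator()
def square(i: float):
  return i * i

# fan the same logic out over a list of inputs; one worker per input
results = square.map([1, 2, 3, 4])
print(results)  # results come back in the same order as the inputs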
