JAX + Flower for Federated Learning gives machine learning researchers the flexibility to use the deep learning framework of their choice for their projects.

Google researchers built JAX to perform NumPy-style computations on GPUs and TPUs. DeepMind uses it to support and accelerate its research, and it is gaining popularity. Differentiation with grad(), vectorization with vmap(), and just-in-time compilation with jit() are some of the composable function transformations that make JAX attractive for machine learning research. Adding a JAX workload to the Flower code examples was therefore a natural next step. The combination of JAX and Flower allows ML and FL researchers to use the deep learning framework their projects require. The new code example serves as a template for porting existing JAX projects to a federated setting.

Setting up a centralized machine learning pipeline is straightforward, and the JAX developer documentation offers many examples. Federating the workload, however, requires some JAX-specific knowledge, because the parameters of the ML model are stored in JAX's DeviceArray data structure. To be compatible with Flower's NumPyClient, those parameters must be converted to NumPy ndarrays. The JAX meets Flower example below shows how this setup works.

Let's start by creating a very simple JAX training setup. The jax_training.py file uses scikit-learn to generate a random linear regression problem. The data is loaded with load_data(), load_model() defines a simple linear regression model, and train() and evaluation() define the training and evaluation of that model, respectively. loss_fn() is a helper function for computing the loss; it is differentiated with JAX's grad() transformation.
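The file itself is not reproduced in full here, but a minimal sketch of what these helpers might look like is shown below (the function names match the example; the bodies are illustrative assumptions, not the example's exact code):

import jax
import jax.numpy as jnp
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

def load_data():
    # Generate a random regression problem and split off a test set
    X, y = make_regression(n_features=3, random_state=0)
    X, X_test, y, y_test = train_test_split(X, y, random_state=0)
    return X, y, X_test, y_test

def load_model(model_shape):
    # A linear model is just a weight vector and a bias term
    return {"w": jnp.zeros(model_shape), "b": jnp.zeros(())}

def loss_fn(params, X, y):
    # Mean squared error of the linear model's predictions
    err = jnp.dot(X, params["w"]) + params["b"] - y
    return jnp.mean(jnp.square(err))

def train(params, grad_fn, X, y, num_steps=50, lr=0.05):
    # Plain gradient descent on the full training set
    for _ in range(num_steps):
        grads = grad_fn(params, X, y)
        params = {k: params[k] - lr * grads[k] for k in params}
    return params, loss_fn(params, X, y), X.shape[0]

def evaluation(params, grad_fn, X_test, y_test):
    # grad_fn is unused here but kept for a uniform signature
    return loss_fn(params, X_test, y_test), X_test.shape[0]

The main() function ties these pieces together: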

def main():

    # Load training and validation data
    X, y, X_test, y_test = load_data()
    model_shape = X.shape[1:]

    # Define the gradient of the loss function
    grad_fn = jax.grad(loss_fn)

    # Loading the linear regression model
    params = load_model(model_shape)   

    # Start model training based on training set
    params, loss, num_examples = train(params, grad_fn, X, y)
    print("Training loss:", loss)

    # Evaluate model (loss)
    loss, num_examples = evaluation(params, grad_fn, X_test, y_test)
    print("Evaluation loss:", loss)

The server sends the global model parameters to a set of randomly selected clients, the clients train the model on their local data, they return the updated model parameters to the server, and the server aggregates the parameter updates received from the clients to obtain a new (hopefully improved) global model. This is one round of federated learning, and it is repeated until the model converges.
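The aggregation step at the heart of this cycle is easy to illustrate in isolation. The toy snippet below (with made-up values, not part of the example) shows the dataset-size-weighted average that FedAvg computes:

import numpy as np

# Two clients return updated parameters along with their local dataset sizes
client_params = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
num_examples = [100, 300]

# Weighted average: clients with more data contribute proportionally more
total = sum(num_examples)
global_params = sum(n * p for n, p in zip(num_examples, client_params)) / total
print(global_params)  # [2.5 3.5]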

By default, the Flower server uses the built-in FedAvg strategy to aggregate the model parameter updates it receives from clients. The resulting global model is then delivered to the next group of randomly selected clients to start the next round of federated training.
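The strategy can also be configured explicitly when starting the server. A minimal sketch, assuming the Flower release used in this post (the parameter values are illustrative, not requirements):

import flwr as fl

# Configure FedAvg explicitly instead of relying on the defaults
strategy = fl.server.strategy.FedAvg(
    fraction_fit=1.0,         # sample 100% of available clients each round
    min_fit_clients=2,        # never train with fewer than two clients
    min_available_clients=2,  # wait until at least two clients have connected
)

fl.server.start_server("0.0.0.0:8080", config={"num_rounds": 3}, strategy=strategy)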

To federate this workload, we simply reuse the functions from jax_training.py to run local training on each client and connect them to Flower. The client code for federated training is described below.

To get started, import all the required packages. Flower (the flwr package), NumPy, and JAX are the main three:

import flwr as fl
import numpy as np
import jax
import jax.numpy as jnp

from typing import Callable, Dict, List, Tuple

import jax_training

The main() function of client.py is very similar to the centralized example. After loading the data and creating the model, the Flower client is started with the local model and data.

def main() -> None:
    """Load data, start NumPyClient."""

    # Load data
    train_x, train_y, test_x, test_y = jax_training.load_data()

    # Define the loss function
    grad_fn = jax.grad(jax_training.loss_fn)

    # Load model (from centralized training) and initialize parameters
    model_shape = train_x.shape[1:]
    params = jax_training.load_model(model_shape)

    # Start Flower client
    client = FlowerClient(params, grad_fn, train_x, train_y, test_x, test_y)
    fl.client.start_numpy_client("0.0.0.0:8080", client)


if __name__ == "__main__":
    main()

FlowerClient is glue code that connects the local model and data to the Flower framework, allowing Flower to call the regular training and evaluation routines. When a client is launched (by running client.py, which calls start_numpy_client), it establishes a connection with the server, waits for messages from the server, processes them using the FlowerClient methods, and returns the results to the server for aggregation.

get_parameters(), set_parameters(), fit(), and evaluate() are the four methods required to implement a Flower NumPyClient. get_parameters() collects the locally defined model parameters. Note that in order to send the local model parameters to the Flower server and start the server-side aggregation process, the JAX parameters must be converted from DeviceArrays to NumPy ndarrays using np.array().

The aggregation strategy averages the collected parameters and applies the result to the global model. The next set of clients receives the updated global model parameters, and set_parameters() writes them into the local model on each of those clients. After another period of local training, evaluation takes place. That completes a single round of federated training.
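The conversion between the two array types is the only JAX-specific part of this. A standalone illustration of the round trip performed by get_parameters() and set_parameters() (the parameter values are made up):

import jax.numpy as jnp
import numpy as np

params = {"w": jnp.array([0.5, -1.2, 3.0]), "b": jnp.array(0.1)}

# JAX -> NumPy: what get_parameters() does before sending to the server
ndarrays = [np.array(val) for val in params.values()]

# NumPy -> JAX: rebuild the parameter dict from what the server sends back
params = {key: jnp.asarray(val) for key, val in zip(params.keys(), ndarrays)}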

class FlowerClient(fl.client.NumPyClient):
    """Flower client implementing linear regression using JAX"""

    def __init__(
        self,
        params: Dict,
        grad_fn: Callable,
        train_x: List[np.ndarray],
        train_y: List[np.ndarray],
        test_x: List[np.ndarray],
        test_y: List[np.ndarray],
    ) -> None:
        self.params = params
        self.grad_fn = grad_fn
        self.train_x = train_x
        self.train_y = train_y
        self.test_x = test_x
        self.test_y = test_y

    def get_parameters(self):
        # Return model parameters as a list of NumPy ndarrays
        parameter_value = []
        for _, val in self.params.items():
            parameter_value.append(np.array(val))
        return parameter_value
    
    def set_parameters(self, parameters: List[np.ndarray]) -> Dict:
        # Copy the parameters received from the server into the local model
        for key, value in zip(self.params.keys(), parameters):
            self.params[key] = value
        return self.params
    
    def fit(
        self, parameters: List[np.ndarray], config: Dict
    ) -> Tuple[List[np.ndarray], int, Dict]:
        # Set model parameters, train model, return updated model parameters
        print("Start local training")
        self.params = self.set_parameters(parameters)
        self.params, loss, num_examples = jax_training.train(self.params, self.grad_fn, self.train_x, self.train_y)
        results = {"loss": float(loss)}
        print("Training results", results)
        return self.get_parameters(), num_examples, results

    def evaluate(
        self, parameters: List[np.ndarray], config: Dict
    ) -> Tuple[float, int, Dict]:
        # Set model parameters, evaluate model on local test dataset, return result
        print("Start evaluation")
        self.params = self.set_parameters(parameters)
        loss, num_examples = jax_training.evaluation(self.params, self.grad_fn, self.test_x, self.test_y)
        print("Evaluation loss", loss)
        return (
            float(loss),
            num_examples,
            {"loss": float(loss)},
        )

With server.py, the Flower server can now be started.

import flwr as fl

if __name__ == "__main__":
    fl.server.start_server("0.0.0.0:8080", config={"num_rounds": 3})

Open a terminal window and start the server:

$ python server.py

Start the first client by opening a new terminal and typing:

$ python client.py

Finally, start the second client by opening a new terminal:

$ python client.py

And that's it: a previously centralized JAX example, federated with Flower. To let Flower handle the complexity of federated learning, all you need to do is convert the JAX model parameters to NumPy ndarrays and subclass NumPyClient.

Loading a different data partition on each client, starting more clients, or even defining your own strategy are natural next steps; a sketch of the first of these follows.
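For example, a hypothetical helper could give each client its own shard of the data (partition() and client_id are illustrative names, not part of the example):

import numpy as np

def partition(X, y, num_clients, client_id):
    # Split the dataset into equal shards and return this client's shard
    x_shards = np.array_split(X, num_clients)
    y_shards = np.array_split(y, num_clients)
    return x_shards[client_id], y_shards[client_id]

# Each client would then load only its own portion of the training data:
# train_x, train_y = partition(train_x, train_y, num_clients=2, client_id=0)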

Check out the Advanced TensorFlow example for a deeper look at Flower's features.

Source: https://flower.dev/blog/2022-03-22-jax-meets-flower-federated-learning-with-jax/
