
Facebook Using AI to Accelerate the Performance of Its Web Server

Facebook must run numerous live tests to decide which configurations are ideal for its HTTP servers. It expedited the search for optimal settings with a machine learning approach called Bayesian optimization, which narrows the list of plausible solutions.

In recent years, machine learning has been used to tune the performance of machine learning itself. So why not use it to improve the performance of a web server?

That is the view taken by Facebook's researchers, who recently described their work tuning the servers behind the social network's infrastructure.

The work, by Eytan Bakshy, Guilherme Ottoni, Brian Karrer, and Benjamin Letham, appears in the journal Bayesian Analysis. It is also discussed in a post on Facebook's artificial intelligence (AI) research blog.

Like other Internet services, Facebook runs A/B tests to gauge how its servers perform when one component or another is altered.

If you have ever seen different versions of a web page, with the layout of text altered or the look of a button changed, then you are familiar with this kind of tweaking, used on commercial websites to optimize things like shopping-cart completion or click-through rates.

For this research, the scientists modified the options of the just-in-time (JIT) compiler that converts PHP into native x86 server code. That compiler runs inside the "HipHop Virtual Machine" (HHVM), the open-source server that Facebook uses to serve HTTP requests.

For instance, the JIT can be configured to inline a particular block of code. Since inlining enlarges the generated code, A/B testing is needed to decide whether the speedup it brings justifies the trade-off of consuming additional server memory.
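To make the setup concrete, the kind of search space involved might be sketched as below. The parameter names and ranges here are hypothetical illustrations, not actual HHVM option names:

```python
# Hypothetical JIT tuning search space. These parameter names and
# ranges are illustrative inventions, not real HHVM options.
search_space = {
    # Maximum "cost" of a callee the JIT will inline; raising it
    # inlines more code (faster calls, but larger generated code).
    "inline_max_cost": (50, 500),
    # Cap on total generated machine code, trading speed for memory.
    "code_cache_mb": (64, 512),
}

# Each A/B test of one configuration yields two noisy outcomes:
#   objective  - e.g., CPU time per request (to be minimized)
#   constraint - e.g., peak server memory (must stay within budget)
```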

The authors used an approach called Bayesian optimization, a machine learning technique that uses past observations to home in on an optimal solution.

In the past decade, Bayesian optimization has been used to tune the "hyper-parameters" of machine learning itself, such as how fast the learning rate should be or how large to make the batch size. Because it can eliminate the drudgery of hand-designing hyper-parameters, one group, for instance, described Bayesian optimization as a way of automating machine learning.
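As a rough, self-contained illustration of hyper-parameter tuning with Bayesian optimization (using the open-source scikit-optimize library, not Facebook's own tooling), the whole loop can be wrapped in a single call. The objective below is a synthetic stand-in for a real, expensive training run:

```python
# Toy Bayesian-optimization example with scikit-optimize (skopt).
from skopt import gp_minimize
from skopt.space import Integer, Real

def objective(params):
    learning_rate, batch_size = params
    # Synthetic bowl with a minimum near lr=0.01, batch=128; in practice
    # this would train a model and return a noisy validation loss.
    return (learning_rate - 0.01) ** 2 * 1e4 + ((batch_size - 128) / 128.0) ** 2

result = gp_minimize(
    objective,
    dimensions=[
        Real(1e-4, 1e-1, prior="log-uniform"),  # learning rate
        Integer(16, 512),                       # batch size
    ],
    n_calls=30,       # total experiments -- far fewer than a grid search
    random_state=0,
)
print("best hyper-parameters:", result.x, "best objective:", result.fun)
```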

The Facebook authors used Bayesian optimization to run A/B tests with the JIT compiler's settings at different values. The main advantage here is speed: the tests have to be conducted in a production setting to observe the impact of each setting, so fewer experiments means faster results.

The authors wrote that compared with regular A/B testing, in which one settings change is assessed at a time, Bayesian optimization "allowed us to jointly tune more parameters with fewer experiments and find better values."

The key word here is "jointly": the Bayesian machinery can dismiss certain choices of settings without actually running them as A/B tests, by extrapolating from one A/B test to different parameter values and thereby narrowing the set of "feasible" settings. "A test of a parameter value in a continuous space gives us information about not only its outcome but also those of nearby points," is how the authors described this broader search power.
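That quote reflects the Gaussian-process surrogate model commonly used in Bayesian optimization: an observation at one parameter value shrinks the model's uncertainty at nearby values. Here is a minimal sketch with scikit-learn, on made-up measurements (an assumption about tooling; the paper describes the math, not this library):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Pretend we measured server CPU time at three JIT-threshold settings.
X_observed = np.array([[100.0], [250.0], [400.0]])  # settings tested
y_observed = np.array([0.92, 0.81, 0.88])           # noisy CPU measurements

gp = GaussianProcessRegressor(
    kernel=Matern(nu=2.5), alpha=1e-3, normalize_y=True  # alpha ~ noise level
).fit(X_observed, y_observed)

# Predictions near a tested point are confident; farther away, less so.
X_query = np.array([[260.0], [175.0]])
mean, std = gp.predict(X_query, return_std=True)
for x, m, s in zip(X_query.ravel(), mean, std):
    print(f"setting {x:>5.0f}: predicted {m:.3f} +/- {s:.3f}")
```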

As experiments execute, the Bayesian model acquires new data that it uses to narrow the search for optimal configurations, so the A/B testing process becomes increasingly efficient as it progresses.
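Schematically, that cycle of testing and narrowing can be written as an explicit ask/tell loop; here is a sketch using scikit-optimize's Optimizer, where `run_ab_test` is a hypothetical stand-in for a noisy production measurement:

```python
import random
from skopt import Optimizer

def run_ab_test(inline_max_cost):
    # Synthetic noisy measurement: CPU time with a sweet spot near 250.
    return (inline_max_cost - 250.0) ** 2 / 1e4 + random.gauss(0.0, 0.05)

opt = Optimizer(dimensions=[(50.0, 500.0)], base_estimator="GP", random_state=0)
for _ in range(15):
    x = opt.ask()            # the model proposes the next configuration
    y = run_ab_test(x[0])    # run the (noisy) experiment "in production"
    opt.tell(x, y)           # feed the result back; the search narrows

res = opt.get_result()
print("best setting found:", res.x, "with estimated CPU time:", res.fun)
```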

A novel contribution of this research is its handling of noise in the Bayesian optimization. The authors note that, unlike the task of optimizing machine learning models, testing server configurations in A/B experiments involves a considerable amount of noise in the measurements of the test results.

Furthermore, they pointed out that the constraints themselves are noisy, such as the requirement to keep a server's memory usage within reason.
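One common way to fold a noisy constraint into the search, shown below as a generic constrained-acquisition sketch rather than the authors' exact method, is to model the constraint with its own surrogate and weight the objective's expected improvement by the probability that the constraint is satisfied. All data here are made up for illustration:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

X = np.array([[100.0], [250.0], [400.0]])  # JIT settings already tested
cpu = np.array([0.92, 0.81, 0.88])         # noisy CPU-time measurements
mem = np.array([9.5, 11.0, 14.2])          # noisy memory measurements (GB)
MEM_BUDGET = 12.0                          # servers must stay under 12 GB

gp_obj = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-3,
                                  normalize_y=True).fit(X, cpu)
gp_con = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-2,
                                  normalize_y=True).fit(X, mem)

def constrained_ei(x, best=cpu.min()):
    x = np.atleast_2d(x)
    mu, sd = gp_obj.predict(x, return_std=True)
    z = (best - mu) / sd
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
    mu_c, sd_c = gp_con.predict(x, return_std=True)
    p_ok = norm.cdf((MEM_BUDGET - mu_c) / sd_c)        # P(constraint holds)
    return float(ei * p_ok)                            # value to maximize

grid = np.linspace(50.0, 500.0, 91).reshape(-1, 1)
next_x = grid[np.argmax([constrained_ei(x) for x in grid])]
print("next configuration to test:", next_x[0])
```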

The authors developed an approach for handling such noise in their Bayesian algorithms, and concluded that the new technique found optimal solutions more readily than other Bayesian methods.

The interesting thing about this approach to A/B testing is that some settings will never be tried at all: the Bayesian optimization forecasts which configurations ought to be eliminated and removes them from testing entirely. The authors see this as an advantage, since it potentially minimizes the disruption of exposing users to many different experiments.

Source: ZDNet
