Beat performance monitoring hurdles in a single-page interface world

This podcast discusses how frameworks like AngularJS have simplified the creation of single-page interface applications but have also created new performance monitoring challenges.

The most pervasive trend in the delivery of Web-based applications is the shift toward single-page interfaces (SPIs), a technique that delivers slick, highly responsive, interactive content and makes good on the promise of an enjoyable end-user experience.

Leading the charge in the delivery of these types of sites are three key JavaScript frameworks, namely Ember, Backbone and AngularJS, the most popular of the three. Interestingly, one of the big side effects of the delivery of SPI-based webpages is the manner in which server-side resources are requested, consumed and delivered to the end client. This situation creates an entirely new set of challenges in terms of application monitoring and troubleshooting performance problems, forcing vendors in the monitoring space to take a completely new approach to performance tool development and delivery.

"We are taking advantage of some interesting tricks with modern HTML5 features that browsers are exposing."

Nic Jansma, senior software engineer, SOASTA Inc.

Just think about how Web-based content delivery has changed over the past few years. Not too long ago, most webpages were plain HTML, delivered to the client in a simple request-response cycle. Once a page had fully rendered, the browser would fire off the JavaScript onload event, which made calculating the rendering time of a webpage incredibly easy.
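
With nothing but an onload handler and the Navigation Timing data that browsers expose, the calculation fits in a few lines. Here is a minimal sketch of that classic approach; logging to the console stands in for whatever beacon a real monitoring tool would actually send:

```javascript
// Classic page-load measurement: with plain HTML pages, the browser's
// onload event was a reasonable proxy for "the page is done."
window.addEventListener("load", function () {
  // Navigation Timing exposes the timestamps the browser recorded
  // for this navigation.
  var t = performance.timing;

  // Time from the start of the navigation until the load event fired.
  var pageLoadMs = t.loadEventStart - t.navigationStart;

  // A real monitoring script would beacon this number to a collection
  // server; the console is a stand-in for that here.
  console.log("Page load time: " + pageLoadMs + "ms");
});
```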

The only noteworthy change in the delivery of webpages before the onslaught of the SPI was the use of Ajax-based interactions. Even with asynchronous JavaScript, though, the content exchanged between the client and the server was typically just a snippet of hypertext or perhaps a JSON string, and often something much simpler. Monitoring the initiation of an Ajax-based request and clocking the corresponding response from the server wasn't a grand departure from how webpage performance was tracked in a basic request-response cycle that simply transferred HTML back and forth between the client and the server. With the new SPI frameworks, however, that traditional approach gets thrown aside.
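
To clock those Ajax interactions, monitoring scripts commonly wrap the browser's XMLHttpRequest object so each request records when it started and how long the response took. The following is a simplified, generic sketch of that wrapping technique, not any particular vendor's code:

```javascript
// Wrap XMLHttpRequest so every request records its start time and its
// duration when the response arrives. Real tools also handle errors,
// aborts and fetch(); this only illustrates the idea.
(function () {
  var origOpen = XMLHttpRequest.prototype.open;
  var origSend = XMLHttpRequest.prototype.send;

  XMLHttpRequest.prototype.open = function (method, url) {
    this._monitorUrl = url;              // remember which resource was requested
    return origOpen.apply(this, arguments);
  };

  XMLHttpRequest.prototype.send = function () {
    var start = Date.now();
    this.addEventListener("loadend", function () {
      // "loadend" fires whether the request succeeded or failed.
      console.log(this._monitorUrl + " took " + (Date.now() - start) + "ms");
    });
    return origSend.apply(this, arguments);
  };
})();
```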

Single-page interface upends page loading

With SPIs implemented using frameworks like AngularJS, the first thing a webpage does is load the framework. After the framework is loaded, the browser fires the onload event, which historically has indicated that the page has finished loading. With the popular SPI frameworks on the market, though, nothing could be further from the truth. For example, with an Angular page, the onload event fires once the Angular framework has loaded, but loading Angular is just the beginning of the process. Once initialized, Angular loads a variety of submodules and then does all of the work required to assemble the data needed to render the page, a process that inevitably involves a variety of other calls to the server to obtain JSON, XML and other data feeds. Angular itself decides what should be downloaded and where it should be displayed on the page, which causes problems because "traditional monitoring tools simply don't take this into account," said Nic Jansma, a senior software engineer with SOASTA Inc.
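
The gap is easy to see in a hypothetical AngularJS app that uses ngRoute: the framework's own lifecycle events, not the browser's onload, are what reveal when a view has actually been assembled. The event names below are standard AngularJS APIs; the module name "demoApp" is just a placeholder:

```javascript
// Illustrative AngularJS snippet: window.onload only marks the point where
// angular.js itself has been fetched. The framework's route and view events
// show when the page is actually being put together.
angular.module("demoApp", ["ngRoute"])
  .run(["$rootScope", function ($rootScope) {
    var start;

    $rootScope.$on("$routeChangeStart", function () {
      start = Date.now();                // Angular begins resolving a route
    });

    $rootScope.$on("$viewContentLoaded", function () {
      // The view's template has been compiled and rendered -- long after
      // window.onload, and still before any images it references arrive.
      console.log("View rendered " + (Date.now() - start) + "ms after route change");
    });
  }]);
```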

But simply figuring out how long it takes for an SPI framework to render the initial page isn't the end of the journey. Webpages are now filled with soft navigation points, where a user can transition from one rendering of the page to another without triggering a full request-response cycle. Just think of a Pinterest page with an infinite scroll function. A page may finish loading, but as soon as a user scrolls down an inch, traffic gets generated and additional work begins. And if a user gets really aggressive and does scroll after scroll, performance issues may arise not only on the front end, but on the back end as well.
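
A bare-bones infinite-scroll handler makes the point concrete. In the sketch below, the "/api/pins" endpoint and ".pin" selector are invented for illustration; every trip past the bottom of the viewport fires another server request, traffic that a simple page-load metric will never see:

```javascript
// Each time the user nears the bottom of the page another request is fired,
// so aggressive scrolling translates directly into extra back-end load.
var loading = false;

window.addEventListener("scroll", function () {
  var nearBottom =
    window.innerHeight + window.scrollY >= document.body.scrollHeight - 200;

  if (nearBottom && !loading) {
    loading = true;
    fetch("/api/pins?offset=" + document.querySelectorAll(".pin").length)
      .then(function (res) { return res.json(); })
      .then(function (items) {
        // ...append the new items to the page here...
        loading = false;
      });
  }
});
```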

Metrics track page loads

Fortunately, the performance monitoring challenge is a problem that has solutions, and vendors like SOASTA have been updating their products to keep ahead of the latest trends. To address the challenge of knowing when a page has actually loaded, SOASTA develops products that go beyond monitoring the browser's onload event and instead monitor what happens after the Angular framework has loaded, taking advantage of some of the latest browser features to know when things like JavaScript, images, style sheets and other resources have been loaded. By hooking into HTML5-based events that the latest browsers support, tools can provide a better understanding of how long a page actually takes to render in the end user's browser.
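
One way to approximate when such a page is really done, without leaning on onload, is to watch for a quiet period in both DOM changes and resource downloads using the MutationObserver and Resource Timing APIs that modern browsers expose. The sketch below illustrates that general idea only; it is not SOASTA's implementation, and the one-second quiet window is an arbitrary choice:

```javascript
// Track the timestamp of the most recent page-building activity.
var lastActivity = performance.now();

// DOM mutations signal that the framework is still assembling the page.
new MutationObserver(function () {
  lastActivity = performance.now();
}).observe(document.documentElement, { childList: true, subtree: true });

// Resource Timing entries report when images, scripts and style sheets
// those new nodes pull in have finished downloading.
new PerformanceObserver(function (list) {
  list.getEntries().forEach(function (entry) {
    lastActivity = Math.max(lastActivity, entry.responseEnd);
  });
}).observe({ entryTypes: ["resource"] });

// Declare the page "done" once nothing has happened for a quiet period.
var timer = setInterval(function () {
  if (performance.now() - lastActivity > 1000) {
    clearInterval(timer);
    console.log("Page settled at ~" + Math.round(lastActivity) + "ms");
  }
}, 250);
```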

And while page load time is an important metric, how well a page performs when a user navigates within the page using what are known as "soft clicks" (i.e., clicks that don't trigger a completely new page load) is important as well. Monitoring "all of the other links you click within the site that don't reload a new page" is a new challenge, said Jansma, and it is in this area that application monitoring tools must innovate and adapt.
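
Because soft clicks typically surface as history.pushState() calls rather than full page loads, one common monitoring trick is to wrap pushState and treat each call as the start of a new soft-navigation measurement. A generic sketch, not tied to any particular product:

```javascript
// Wrap history.pushState so each in-page navigation leaves a timeline marker
// that later measurements (DOM settling, resource loads) can be tied back to.
(function () {
  var origPushState = history.pushState;

  history.pushState = function () {
    console.log("Soft navigation started: " + arguments[2]); // the new URL
    performance.mark("soft-nav-start");                      // timeline marker
    return origPushState.apply(this, arguments);
  };

  // Back/forward buttons trigger popstate instead of pushState.
  window.addEventListener("popstate", function () {
    performance.mark("soft-nav-start");
  });
})();
```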

Clearly, webpage delivery has changed, and to keep up with the latest trends, application monitoring tools have needed to change as well. To learn more about how the world of Web-based performance monitoring is changing, download the associated podcast, in which TheServerSide speaks with Jansma.

What features are most important to you when it comes to application monitoring? Let us know.
