
Rendering Large Tabular Data in the Browser

Introduction

HTML is the go-to solution for displaying table-like data. It offers a comprehensive formatting model, so you don't have to reinvent the wheel. However, when the dataset grows large, several performance issues come into play.

This article is aimed at technical leads who are silently judged for poor UI performance. Let’s dive into the problems associated with rendering large tabular data in the browser and explore potential solutions.

Challenges with Large Tabular Data

When displaying large or potentially large datasets, several performance problems can arise, such as:

  • Large numbers of DOM nodes consume memory and slow down initial rendering.
  • Layout changes across those nodes incur noticeable reflow costs.

Even if your high-end MacBook M3 doesn’t feel the impact, users with lower-powered devices may experience significant slowdowns. At some point, large software organizations realize this and take action. This usually leads to a meeting with designers and front-end developers to discuss solutions like:

  • Reducing the number of DOM nodes and expensive CSS rules.
  • Limiting or eliminating animations.
  • Restricting users’ ability to scroll.

Profiling Tools are Your Friend

Serious HTML projects are bound by many constraints, and as a developer, you’ll need to lean heavily on profiling tools to keep performance in check. One solution to the ‘large number of DOM nodes’ problem is list virtualisation, which keeps the full dataset in memory but only renders the DOM nodes near the user’s current viewport. The technique is similar to occlusion culling in video games; a minimal sketch follows below.
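
As a rough illustration, here is a minimal virtualised-window sketch in TypeScript. The fixed row height, the overscan buffer, and the renderRow callback are assumptions made for this example, not any particular library’s API.

```typescript
// Minimal list-virtualisation sketch: the full dataset stays in memory,
// but only rows near the viewport become DOM nodes. ROW_HEIGHT, OVERSCAN,
// and renderRow are illustrative assumptions, not a library API.
const ROW_HEIGHT = 28; // a fixed row height keeps the index math trivial
const OVERSCAN = 5;    // extra rows rendered above/below the viewport

function renderWindow(
  container: HTMLElement, // assumed `position: relative; overflow-y: auto`
  rows: string[],
  renderRow: (text: string) => HTMLElement,
): void {
  const first = Math.max(0, Math.floor(container.scrollTop / ROW_HEIGHT) - OVERSCAN);
  const count = Math.ceil(container.clientHeight / ROW_HEIGHT) + 2 * OVERSCAN;
  const last = Math.min(rows.length, first + count);

  container.replaceChildren();
  // A tall, empty spacer preserves the scrollbar geometry of the full list.
  const spacer = document.createElement("div");
  spacer.style.height = `${rows.length * ROW_HEIGHT}px`;
  container.appendChild(spacer);

  for (let i = first; i < last; i++) {
    const el = renderRow(rows[i]);
    el.style.cssText = `position: absolute; top: ${i * ROW_HEIGHT}px; left: 0; right: 0; height: ${ROW_HEIGHT}px;`;
    container.appendChild(el);
  }
}

// Usage: re-render the window on every scroll tick.
// container.addEventListener("scroll", () => renderWindow(container, rows, renderRow));
```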

Beyond Virtualization: The Limitations of HTML and CSS

Even with list virtualisation, reflow costs can still undermine the smoothness of animations. Let’s be honest: HTML and CSS are long overdue for a successor. However, until new technologies like WebGPU become widely adopted, we need to work with what we have. GPU-accelerated alternatives such as Canvas 2D and WebGL exist, but the talent pool for these technologies is limited.
Three.js helps bridge this gap for 3D rendering, but for most, it’s still a niche area.

A Radical UI Strategy: Replacing DOM with Images

Here’s a radical idea: replace large parts of your DOM with images to reduce node count and improve performance. This strategy involves:

  • Low node count: A single image can stand in for hundreds of DOM nodes.
  • Reduced reflow: An image is one replaced element, so it reflows far more cheaply than a nested DOM tree.
  • Dynamic generation: You can generate images on the fly in the browser using a canvas or OffscreenCanvas and serve them via Blob URLs (see the sketch after this list).
  • Low memory usage: Formats like JPEG, PNG, WebP, and AVIF can consume less memory than the equivalent DOM nodes.
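
To make the dynamic-generation point concrete, here is a minimal sketch that rasterises a chunk of rows into an OffscreenCanvas and hands back a Blob URL for an <img> element. The cell layout constants and the rows shape are illustrative assumptions.

```typescript
// Minimal sketch: rasterise a chunk of rows into an OffscreenCanvas and
// return a Blob URL that an <img> can display. Layout constants are
// assumptions for the example.
async function renderRowsToImage(rows: string[][]): Promise<string> {
  const rowHeight = 24;
  const colWidth = 160;
  const canvas = new OffscreenCanvas(800, rows.length * rowHeight);
  const ctx = canvas.getContext("2d")!;
  ctx.font = "14px sans-serif";
  ctx.fillStyle = "#222";
  rows.forEach((cells, r) =>
    cells.forEach((cell, c) =>
      ctx.fillText(cell, c * colWidth + 8, r * rowHeight + 16),
    ),
  );
  // convertToBlob is OffscreenCanvas's async equivalent of toBlob().
  const blob = await canvas.convertToBlob({ type: "image/webp" });
  return URL.createObjectURL(blob); // revoke with URL.revokeObjectURL when done
}

// Usage: const img = new Image(); img.src = await renderRowsToImage(chunk);
```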

However, there are several drawbacks to this approach:

  • Poor interactivity and accessibility.
  • Inability to select or copy data from the image.
  • Requires high-resolution rendering to handle different device pixel ratios and scaling.

Fluid Large-Scale Tabular Data Strategy

To handle large datasets fluidly, you can use CSS 3D transforms to promote content to its own compositing layer, letting the GPU move it smoothly during animation. While this doesn’t entirely eliminate reflow, it offers much better performance during motion. The strategy involves:

  • Fetching data and rendering it with an OffscreenCanvas: The canvas context is stateful, so you’ll need to manage it carefully.
  • Rendering the canvas to images: Use Blob URLs and replace your complex DOM UI with images during periods of motion.
  • Adding an interactive DOM overlay: Create a UI overlay representing the portion of data currently in view. This overlay can fade out during animations and reappear afterward (sketched after this list).
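
Here is a rough sketch of the motion swap itself: while the user scrolls, the pre-rendered image layer moves on the GPU compositor and the interactive overlay fades out; the overlay returns once scrolling settles. The element IDs and the 150 ms settle delay are assumptions for the example.

```typescript
// Sketch: during scroll, move the pre-rendered image layer on the compositor
// (3D transform) and hide the interactive DOM overlay; restore the overlay
// once scrolling settles. IDs and the settle delay are assumptions.
const imageLayer = document.getElementById("image-layer")!;
const overlay = document.getElementById("dom-overlay")!;
imageLayer.style.willChange = "transform"; // hint: promote to its own layer

let settleTimer: number | undefined;
window.addEventListener("scroll", () => {
  overlay.style.opacity = "0"; // hide interactive DOM while in motion
  imageLayer.style.transform = `translate3d(0, ${-window.scrollY}px, 0)`;
  clearTimeout(settleTimer);
  settleTimer = window.setTimeout(() => {
    overlay.style.opacity = "1"; // bring interactivity back after motion
  }, 150);
}, { passive: true });
```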

Is It Worth It?

This approach is definitely overkill for typical applications or moderate amounts of data. However, if you’re struggling with performance, it might be worth considering. Remember to weigh the image size and memory cost before committing to it.
For example, ten AVIF images, each measuring 2000 x 1000 pixels, decode to around 80 MB of long-term heap memory, since decoded bitmaps are stored uncompressed regardless of the delivery format. While this is reasonable, you’ll need to evaluate whether the trade-offs are worth it for your app.
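
That figure comes from straightforward arithmetic, assuming the usual 4 bytes per pixel for decoded RGBA bitmaps:

```typescript
// Decoded images occupy width * height * 4 bytes (RGBA) of bitmap memory,
// regardless of the compressed format (AVIF, WebP, ...) they arrived in.
const width = 2000, height = 1000, bytesPerPixel = 4, imageCount = 10;
const totalBytes = width * height * bytesPerPixel * imageCount;
console.log(`${totalBytes / 1e6} MB`); // 80 MB
```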

Conclusion

In the end, this strategy is a more ambitious take on list virtualisation. By rendering large datasets fluidly and efficiently, you can create a smooth user experience while giving the illusion of thousands or millions of DOM nodes flying through the page.