docs: update faq
vikiboss committed Dec 13, 2024
1 parent 56fe07a commit f6d3560
Showing 2 changed files with 44 additions and 0 deletions.
22 changes: 22 additions & 0 deletions docs/en/guide/faq.md
@@ -30,3 +30,25 @@ function App() {
)
}
```

## ❓ Noticeable Lag When Handling Extremely Large Datasets (Typically Tens of Millions of Reads or More)

In the vast majority of use cases, you are unlikely to encounter performance bottlenecks. However, when working with extremely large datasets (tens of millions of read operations or more), performance can become a real problem. This is mainly because with `Proxy`, every property access triggers the proxy's `get` trap, which adds up to significant overhead across a large number of reads. To avoid performance issues with extremely large datasets, consider the following solutions:

- **Use useState:** Manage large datasets that don't require reactive features separately using hooks like `useState`.
- **Use ref to wrap:** Wrap large datasets with [ref](/reference/advanced/ref) to prevent them from being proxied by `Proxy`.
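
The ref approach can be sketched roughly as follows. This is a minimal illustration, not the library's actual implementation: the `reactive`, `ref`, and `RAW` names here are assumptions made for the sketch.

```typescript
// Minimal sketch: a naive deep reactive() wrapper plus a ref()-style
// escape hatch. Objects marked with the RAW symbol are returned as-is,
// so reads on them never pass through a Proxy get trap.
const RAW = Symbol('raw');

function ref<T extends object>(value: T): T {
  // mark the object so reactive() skips proxying it
  return Object.assign(value, { [RAW]: true });
}

function reactive<T extends object>(target: T): T {
  if ((target as any)[RAW]) return target; // opt-out: no Proxy overhead
  return new Proxy(target, {
    get(obj, key, receiver) {
      const value = Reflect.get(obj, key, receiver);
      // deep-proxy nested objects, unless ref() marked them as raw
      return typeof value === 'object' && value !== null
        ? (reactive(value as object) as typeof value)
        : value;
    },
  });
}

// A huge dataset wrapped in ref() stays a plain object inside the store.
const bigData = ref({ rows: new Array(1_000_000).fill(0) });
const state = reactive({ bigData, label: 'demo' });

console.log(state.bigData === bigData); // true: raw object, hot-loop reads skip the trap
```

Reads on `state.bigData.rows` then cost the same as on a plain object, because the marked dataset is handed back untouched instead of being wrapped in a proxy.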

You can feel this performance difference intuitively by running the following code in the console:

```tsx
const obj = { name: 'Reactive' };
const proxiedObj = new Proxy(obj, {});

console.time('Normal Object Get');
for(let i = 0; i < 100_000_000; i++) obj.name;
console.timeEnd('Normal Object Get'); // ~50ms, Chrome 131, MacBook Pro (M1 Pro + 16G)

console.time('Proxied Object Get');
for(let i = 0; i < 100_000_000; i++) proxiedObj.name;
console.timeEnd('Proxied Object Get'); // ~1000ms, Chrome 131, MacBook Pro (M1 Pro + 16G)
```
22 changes: 22 additions & 0 deletions docs/zh-cn/guide/faq.md
@@ -30,3 +30,25 @@ function App() {
)
}
```

## ❓ Noticeable Lag When Operating on Extremely Large Datasets (Typically Tens of Millions of Reads or More) {#large-data}

In the vast majority of use cases, you are unlikely to hit a performance bottleneck. However, when working with extremely large datasets (tens of millions of read operations or more), performance can become a real problem. This is mainly because with `Proxy`, every data access triggers the proxy's `get` trap, which adds up to significant overhead across a large number of reads. To avoid performance issues with extremely large datasets, consider the following strategies:

- **Use useState:** Manage large datasets that do not need reactivity separately, with hooks such as `useState`.
- **Use ref to wrap:** Wrap large datasets with [ref](/reference/advanced/ref) so they are not proxied by `Proxy`.

You can get an intuitive feel for this performance difference by running the following code in the console:

```tsx
const obj = { name: 'Reactive' };
const proxiedObj = new Proxy(obj, {});

console.time('Normal Object Get');
for(let i = 0; i < 100_000_000; i++) obj.name;
console.timeEnd('Normal Object Get'); // ~50ms, Chrome 131, MacBook Pro (M1 Pro + 16G)

console.time('Proxied Object Get');
for(let i = 0; i < 100_000_000; i++) proxiedObj.name;
console.timeEnd('Proxied Object Get'); // ~1000ms, Chrome 131, MacBook Pro (M1 Pro + 16G)
```
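
Since the overhead scales with how often the `get` trap fires, a related mitigation (a generic sketch, not tied to any particular library) is to hoist proxied reads out of hot loops:

```typescript
// Sketch: copying a proxied value into a local before a hot loop pays
// the Proxy get trap once, instead of once per iteration.
const data = new Proxy(
  { rows: Array.from({ length: 1_000 }, (_, i) => i) },
  {}
);

// Slower pattern: `data.rows` goes through the get trap on every pass.
let slowSum = 0;
for (let i = 0; i < data.rows.length; i++) slowSum += data.rows[i];

// Faster pattern: one trapped read, then plain array access.
const rows = data.rows;
let fastSum = 0;
for (let i = 0; i < rows.length; i++) fastSum += rows[i];

console.log(slowSum === fastSum); // true: identical result, far fewer trap hits
```

The results are identical; only the number of trapped reads differs, which is exactly the cost the benchmark above measures.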
