<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
<channel>
<title>Antonio Bares</title>
<link>https://agbares.com/</link>
<description>Recent content on Antonio Bares</description>
<generator>Hugo -- gohugo.io</generator>
<language>en-us</language>
<lastBuildDate>Sat, 26 Feb 2022 22:53:30 -0800</lastBuildDate><atom:link href="https://agbares.com/index.xml" rel="self" type="application/rss+xml" />
<item>
<title>Spark Resource Calculator</title>
<link>https://agbares.com/blog/spark-resource-calculator/</link>
<pubDate>Sat, 26 Feb 2022 22:53:30 -0800</pubDate>
<guid>https://agbares.com/blog/spark-resource-calculator/</guid>
<description>After optimizing a few Spark jobs, I realized that calculating the executor memory space is an extremely manual process.
I decided to write a simple UI that calculates an executor&rsquo;s on-heap memory space (e.g. spark/executor/storage memory). I might extend this in the future to also include the executor container&rsquo;s entire memory space (e.g. executor memory, overhead, and off-heap).
You can find the calculator by heading to agbares.com/spark-resource-calculator-ui</description>
</item>
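<!--
  A minimal sketch (in Python) of the on-heap split such a calculator computes,
  assuming Spark's documented unified memory model with its default settings
  (spark.memory.fraction=0.6, spark.memory.storageFraction=0.5). The function name
  and the example heap size are illustrative, not taken from the post itself.

    RESERVED_MB = 300  # heap Spark sets aside for internal objects

    def on_heap_regions(executor_memory_mb: int,
                        memory_fraction: float = 0.6,
                        storage_fraction: float = 0.5) -> dict:
        """Split spark.executor.memory (in MB) into its on-heap regions."""
        usable = executor_memory_mb - RESERVED_MB
        unified = usable * memory_fraction             # shared execution + storage pool
        storage = unified * storage_fraction           # caching, broadcast variables
        execution = unified * (1 - storage_fraction)   # shuffles, joins, sorts
        user = usable - unified                        # user data structures, UDF objects
        return {"storage_mb": storage, "execution_mb": execution, "user_mb": user}

    # Example: a 4 GiB executor heap
    print(on_heap_regions(4096))
-->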
<item>
<title>Spark Executor Memory (Pyspark)</title>
<link>https://agbares.com/blog/spark-executor-memory/</link>
<pubDate>Sun, 06 Feb 2022 17:35:01 -0800</pubDate>
<guid>https://agbares.com/blog/spark-executor-memory/</guid>
<description>Over the past year, I&rsquo;ve been building a fair amount of Spark ETL pipelines at work (via PySpark). The complexity of the pipelines I build has been growing, and that complexity required a better understanding of Spark&rsquo;s inner workings.
After a lot of reading, YouTube videos, and docs, I think I have a better grasp on Spark&rsquo;s memory model. A lot of the information online can be quite confusing, and frankly, incorrect or out of date.</description>
</item>
</channel>
</rss>