Clarify spark.cores.max
It controls the count of cores across the cluster, not on a per-machine basis.
ash211 committed Jan 6, 2014
1 parent a2e7e04 commit 2dd4fb5
Showing 1 changed file with 2 additions and 1 deletion.
docs/configuration.md
@@ -81,7 +81,8 @@ there are at least five properties that you will commonly want to control:
   <td>
     When running on a <a href="spark-standalone.html">standalone deploy cluster</a> or a
     <a href="running-on-mesos.html#mesos-run-modes">Mesos cluster in "coarse-grained"
-    sharing mode</a>, how many CPU cores to request at most. The default will use all available cores
+    sharing mode</a>, the maximum amount of CPU cores to request for the application from
+    across the cluster (not from each machine). The default will use all available cores
     offered by the cluster manager.
   </td>
 </tr>
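For context, here is a minimal sketch of how an application would set this property. The master URL, application name, and the cap of 8 cores are illustrative assumptions, not values from the commit:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// spark.cores.max caps the total cores this application may hold across the
// whole cluster, not per machine. With a cap of 8 on a four-worker cluster,
// the scheduler might take 2 cores on each worker, or all 8 on one machine.
val conf = new SparkConf()
  .setMaster("spark://master:7077")   // standalone deploy cluster (assumed URL)
  .setAppName("CoresMaxExample")      // hypothetical app name
  .set("spark.cores.max", "8")        // at most 8 cores cluster-wide

val sc = new SparkContext(conf)
```

If spark.cores.max is left unset, the application claims every core the standalone or coarse-grained Mesos cluster manager offers, as the default described above.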
