forked from mc2-project/opaque-sql
Showing 3 changed files with 109 additions and 1 deletion.
```diff
@@ -116,4 +116,4 @@ Next, run Apache Spark SQL queries with Opaque as follows, assuming Spark is alr

 ## Contact

-If you want to know more about our project or have questions, please contact Ankur ([email protected]) and/or Wenting ([email protected]).
+If you want to know more about our project or have questions, please contact Wenting ([email protected]) and/or Ankur ([email protected]).
```
@@ -0,0 +1,56 @@
# Opaque demo

## Setup

Follow the setup instructions in the README (steps 1-3) to launch a Spark shell with Opaque. To try out our attack, you should also include `--conf "spark.driver.extraJavaOptions=-Xdebug -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8001"` when you launch the Spark shell.

## Usage example

### Data creation

```scala
val data = for (i <- 0 until 10) yield ("foo", i)
val rdd_data = spark.sparkContext.makeRDD(data, 1)
```
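The comprehension itself is plain Scala, so the dataset can be inspected without a running SparkContext (only the `makeRDD` call needs one); a minimal sketch:

```scala
object DataSketch extends App {
  // Same generator as the demo: ten ("foo", i) pairs with i = 0..9.
  val data = for (i <- 0 until 10) yield ("foo", i)
  println(data.size)  // 10
  println(data.head)  // (foo,0)
  println(data.last)  // (foo,9)
}
```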

### DataFrame creation

```scala
val words = spark.createDataFrame(rdd_data).toDF("word", "count")
val words_e = spark.createDataFrame(rdd_data).toDF("word", "count").encrypted
val words_o = spark.createDataFrame(rdd_data).toDF("word", "count").oblivious
```

### Query execution

```scala
words.filter($"count" > lit(3)).collect
words_e.filter($"count" > lit(3)).collect
words_o.filter($"count" > lit(3)).collect
```
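All three variants evaluate the same predicate; on the ten-row dataset above, `count > 3` should keep the six rows with counts 4 through 9 regardless of mode. A plain-Scala sketch of the expected result (no Spark required):

```scala
object FilterSketch extends App {
  val data = for (i <- 0 until 10) yield ("foo", i)
  // Mirrors words.filter($"count" > lit(3)) on the plain collection.
  val kept = data.filter { case (_, count) => count > 3 }
  println(kept.size)       // 6
  println(kept.map(_._2))  // Vector(4, 5, 6, 7, 8, 9)
}
```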

## Attack example

### Attach jdb

```shell
jdb -sourcepath $HOME/spark/sql/core/src/main/scala:$HOME/opaque/src/main/scala -attach 8001
```

### Attack Spark SQL

```
stop at org.apache.spark.sql.execution.FilterExec$$anonfun$12$$anonfun$apply$2:126
list
print row
cont
cont
cont
cont
print row
set r = false
clear org.apache.spark.sql.execution.FilterExec$$anonfun$12$$anonfun$apply$2:126
```
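The `set r = false` step overwrites the filter's boolean result for the row currently stopped at the breakpoint, so a row that should pass the predicate is silently dropped from the output. A plain-Scala model of that effect (the variable names are illustrative, not Spark's generated code):

```scala
object PredicateFlipSketch extends App {
  val data = for (i <- 0 until 10) yield ("foo", i)

  // Honest evaluation of count > 3 keeps six rows.
  val honest = data.filter { case (_, count) => count > 3 }

  // Attacker flips the predicate result to false for one row (here count == 5),
  // mimicking `set r = false` at the breakpoint.
  val tampered = data.filter { case (_, count) =>
    val r = count > 3
    if (count == 5) false else r
  }

  println(honest.size)    // 6
  println(tampered.size)  // 5 -- one row silently missing
}
```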

### Attack Opaque (works for both encryption and oblivious modes)

```
stop at edu.berkeley.cs.rise.opaque.execution.ObliviousFilterExec$$anonfun$executeBlocked$14:323
list
dump filtered
set filtered[0] = 123
cont
```
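Here the tampering attempt fails: `filtered` holds ciphertext, and Opaque's data blocks are protected with authenticated encryption. The sketch below assumes an AES-GCM-style scheme and uses the JDK's `javax.crypto`, not Opaque's own code, to show why any byte the debugger overwrites invalidates the authentication tag at decryption time:

```scala
import javax.crypto.{AEADBadTagException, Cipher, KeyGenerator}
import javax.crypto.spec.GCMParameterSpec

object GcmTamperSketch extends App {
  val keyGen = KeyGenerator.getInstance("AES")
  keyGen.init(128)
  val key = keyGen.generateKey()
  // Fixed IV for this sketch only; real systems must use a fresh IV per message.
  val spec = new GCMParameterSpec(128, new Array[Byte](12))

  val enc = Cipher.getInstance("AES/GCM/NoPadding")
  enc.init(Cipher.ENCRYPT_MODE, key, spec)
  val block = enc.doFinal("foo,5".getBytes("UTF-8"))

  // Mimic `set filtered[0] = 123`: corrupt the first ciphertext byte
  // (XOR with a nonzero value guarantees the byte actually changes).
  block(0) = (block(0) ^ 0x7b).toByte

  val dec = Cipher.getInstance("AES/GCM/NoPadding")
  dec.init(Cipher.DECRYPT_MODE, key, spec)
  try {
    dec.doFinal(block)
    println("tampering went undetected")
  } catch {
    case _: AEADBadTagException =>
      println("tampering detected: GCM authentication tag mismatch")
  }
}
```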
@@ -0,0 +1,52 @@
* Opaque demo
** Setup
cd ~/spark
source conf/spark-env.sh
build/sbt package
cd ~/opaque
bin/spark-shell --master local[1] --jars $HOME/opaque/target/scala-2.11/opaque_2.11-0.1.jar --conf "spark.driver.extraJavaOptions=-Xdebug -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8001"

** In Spark shell
*** Import
import edu.berkeley.cs.rise.opaque.implicits._
edu.berkeley.cs.rise.opaque.Utils.initSQLContext(spark.sqlContext)

*** Data creation
val data = for (i <- 0 until 10) yield ("foo", i)
val rdd_data = spark.sparkContext.makeRDD(data, 1)

*** DataFrame creation

val words = spark.createDataFrame(rdd_data).toDF("word", "count")
val words_e = spark.createDataFrame(rdd_data).toDF("word", "count").encrypted
val words_o = spark.createDataFrame(rdd_data).toDF("word", "count").oblivious

*** Query execution

words.filter($"count" > lit(3)).collect
words_e.filter($"count" > lit(3)).collect
words_o.filter($"count" > lit(3)).collect

** In JDB
PS1="\[\033[01;38;5;210m\]\u@\h (attacker"'!'")\[\033[00m\] \D{%Y-%m-%d %H:%M:%S} \[\033[01;34m\]\w\[\033[00m\]\n\$ "

jdb -sourcepath $HOME/spark/sql/core/src/main/scala:$HOME/opaque/src/main/scala -attach 8001

*** For attacking insecure Spark
stop at org.apache.spark.sql.execution.FilterExec$$anonfun$12$$anonfun$apply$2:126
list
print row
cont
cont
cont
cont
print row
set r = false
clear org.apache.spark.sql.execution.FilterExec$$anonfun$12$$anonfun$apply$2:126

*** For attempting to attack Opaque
stop at edu.berkeley.cs.rise.opaque.execution.ObliviousFilterExec$$anonfun$executeBlocked$14:323
list
dump filtered
set filtered[0] = 123
cont