Merge tag 'perf-core-for-mingo-4.14-20170816' of git://git.kernel.org/pub/scm/linux/kernel/git/acme/linux into perf/core

Pull perf core improvements and fixes:

New features:

- Support exporting Intel PT data to sqlite3 with python perf scripts;
  this is in addition to the postgresql support that was already there (Adrian Hunter)
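
  As a rough sketch of the new workflow (the event specification and the
  database name 'pt_example' below are only illustrative; the trailing
  'branches calls' arguments follow the convention of the existing
  postgresql exporter):

      # Record an Intel PT trace, then export it to an sqlite3 database file.
      perf record -e intel_pt//u ls
      perf script -s tools/perf/scripts/python/export-to-sqlite.py pt_example branches calls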

Infrastructure changes:

- Handle perf tool builds with fewer features in perf shell tests, such
  as those with NO_LIBDWARF=1 or even without 'perf probe' (Arnaldo Carvalho de Melo)
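
  A minimal sketch of the kind of guard such tests need; the helper name and
  the exact error message matched here are illustrative, not necessarily what
  this merge adds:

      # Skip (exit code 2) when the running perf was built without 'probe'.
      skip_if_no_perf_probe() {
              perf probe 2>&1 | grep -q 'is not a perf-command' && return 2
              return 0
      }
      skip_if_no_perf_probe || exit 2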

- Replace '|&' with '2>&1 |' to work with more shells in the just
  introduced perf test shell harness (Kim Phillips)
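
  The two spellings are equivalent where both exist; only the second is
  portable beyond bash/zsh ('some_command' and 'pattern' are placeholders):

      some_command |& grep pattern        # bash/zsh shorthand, absent in dash and other POSIX shells
      some_command 2>&1 | grep pattern    # equivalent and portable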

Architecture related fixes:

- Fix endianness problem when loading parameters in the BPF prologue
  generated by perf, noticed using 'perf test BPF' in s390x systems (Wang Nan, Thomas Richter)
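
  The affected case can be exercised directly; 'perf test' takes a substring
  to select tests and -v for verbose output:

      # Run the BPF subtests verbosely; on s390x the prologue test failed
      # before this fix.
      perf test -v BPF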

Signed-off-by: Arnaldo Carvalho de Melo <[email protected]>
Signed-off-by: Ingo Molnar <[email protected]>
Ingo Molnar committed Aug 17, 2017
2 parents 927d2c2 + 35435cd commit 9881223
Showing 14 changed files with 611 additions and 47 deletions.
6 changes: 3 additions & 3 deletions tools/perf/Documentation/intel-pt.txt
@@ -104,9 +104,9 @@ system, asynchronous, interrupt, transaction abort, trace begin, trace end, and
 in transaction, respectively.

 While it is possible to create scripts to analyze the data, an alternative
-approach is available to export the data to a postgresql database. Refer to
-script export-to-postgresql.py for more details, and to script
-call-graph-from-postgresql.py for an example of using the database.
+approach is available to export the data to a sqlite or postgresql database.
+Refer to script export-to-sqlite.py or export-to-postgresql.py for more details,
+and to script call-graph-from-sql.py for an example of using the database.

 There is also script intel-pt-events.py which provides an example of how to
 unpack the raw data for power events and PTWRITE.
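
Following the documentation above, the same browser is used for either backend
once a database has been exported (pt_example is the database name used in the
scripts' own examples):

    # Browse the exported database (an sqlite3 file or a postgresql database name).
    python tools/perf/scripts/python/call-graph-from-sql.py pt_example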
8 changes: 8 additions & 0 deletions tools/perf/scripts/python/bin/export-to-sqlite-record
@@ -0,0 +1,8 @@
#!/bin/bash

#
# export perf data to a sqlite3 database. Can cover
# perf ip samples (excluding the tracepoints). No special
# record requirements, just record what you want to export.
#
perf record $@
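
This wrapper is what 'perf script record' invokes for the sqlite exporter; a
hypothetical invocation (event and workload chosen only for illustration):

    # The wrapper forwards its arguments straight to 'perf record', so these
    # two commands are expected to be equivalent:
    perf script record export-to-sqlite -e intel_pt//u -- sleep 1
    perf record -e intel_pt//u -- sleep 1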
29 changes: 29 additions & 0 deletions tools/perf/scripts/python/bin/export-to-sqlite-report
@@ -0,0 +1,29 @@
#!/bin/bash
# description: export perf data to a sqlite3 database
# args: [database name] [columns] [calls]
n_args=0
for i in "$@"
do
    if expr match "$i" "-" > /dev/null ; then
        break
    fi
    n_args=$(( $n_args + 1 ))
done
if [ "$n_args" -gt 3 ] ; then
    echo "usage: export-to-sqlite-report [database name] [columns] [calls]"
    exit
fi
if [ "$n_args" -gt 2 ] ; then
    dbname=$1
    columns=$2
    calls=$3
    shift 3
elif [ "$n_args" -gt 1 ] ; then
    dbname=$1
    columns=$2
    shift 2
elif [ "$n_args" -gt 0 ] ; then
    dbname=$1
    shift
fi
perf script $@ -s "$PERF_EXEC_PATH"/scripts/python/export-to-sqlite.py $dbname $columns $calls
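
Argument handling mirrors the existing postgresql report wrapper; a
hypothetical run (the database name and the optional 'branches'/'calls'
arguments are taken from the postgresql exporter's documented example):

    # Export the current perf.data into the sqlite3 file 'pt_example',
    # passing the optional 'branches' and 'calls' arguments through to the script.
    perf script report export-to-sqlite pt_example branches calls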
tools/perf/scripts/python/call-graph-from-postgresql.py → tools/perf/scripts/python/call-graph-from-sql.py
@@ -1,6 +1,6 @@
 #!/usr/bin/python2
-# call-graph-from-postgresql.py: create call-graph from postgresql database
-# Copyright (c) 2014, Intel Corporation.
+# call-graph-from-sql.py: create call-graph from sql database
+# Copyright (c) 2014-2017, Intel Corporation.
 #
 # This program is free software; you can redistribute it and/or modify it
 # under the terms and conditions of the GNU General Public License,
@@ -11,18 +11,19 @@
 # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
 # more details.

-# To use this script you will need to have exported data using the
-# export-to-postgresql.py script. Refer to that script for details.
+# To use this script you will need to have exported data using either the
+# export-to-sqlite.py or the export-to-postgresql.py script. Refer to those
+# scripts for details.
 #
-# Following on from the example in the export-to-postgresql.py script, a
+# Following on from the example in the export scripts, a
 # call-graph can be displayed for the pt_example database like this:
 #
-# python tools/perf/scripts/python/call-graph-from-postgresql.py pt_example
+# python tools/perf/scripts/python/call-graph-from-sql.py pt_example
 #
-# Note this script supports connecting to remote databases by setting hostname,
-# port, username, password, and dbname e.g.
+# Note that for PostgreSQL, this script supports connecting to remote databases
+# by setting hostname, port, username, password, and dbname e.g.
 #
-# python tools/perf/scripts/python/call-graph-from-postgresql.py "hostname=myhost username=myuser password=mypassword dbname=pt_example"
+# python tools/perf/scripts/python/call-graph-from-sql.py "hostname=myhost username=myuser password=mypassword dbname=pt_example"
 #
 # The result is a GUI window with a tree representing a context-sensitive
 # call-graph. Expanding a couple of levels of the tree and adjusting column
@@ -160,7 +161,7 @@ def selectCalls(self):
             '( SELECT short_name FROM dsos WHERE id = ( SELECT dso_id FROM symbols WHERE id = ( SELECT symbol_id FROM call_paths WHERE id = call_path_id ) ) ), '
             '( SELECT ip FROM call_paths where id = call_path_id ) '
             'FROM calls WHERE parent_call_path_id = ' + str(self.call_path_id) + ' AND comm_id = ' + str(self.comm_id) + ' AND thread_id = ' + str(self.thread_id) +
-            'ORDER BY call_path_id')
+            ' ORDER BY call_path_id')
         if not ret:
             raise Exception("Query failed: " + query.lastError().text())
         last_call_path_id = 0
@@ -291,29 +292,40 @@ def __init__(self, db, dbname, parent=None):

 if __name__ == '__main__':
     if (len(sys.argv) < 2):
-        print >> sys.stderr, "Usage is: call-graph-from-postgresql.py <database name>"
+        print >> sys.stderr, "Usage is: call-graph-from-sql.py <database name>"
         raise Exception("Too few arguments")

     dbname = sys.argv[1]

-    db = QSqlDatabase.addDatabase('QPSQL')
-
-    opts = dbname.split()
-    for opt in opts:
-        if '=' in opt:
-            opt = opt.split('=')
-            if opt[0] == 'hostname':
-                db.setHostName(opt[1])
-            elif opt[0] == 'port':
-                db.setPort(int(opt[1]))
-            elif opt[0] == 'username':
-                db.setUserName(opt[1])
-            elif opt[0] == 'password':
-                db.setPassword(opt[1])
-            elif opt[0] == 'dbname':
-                dbname = opt[1]
-        else:
-            dbname = opt
+    is_sqlite3 = False
+    try:
+        f = open(dbname)
+        if f.read(15) == "SQLite format 3":
+            is_sqlite3 = True
+        f.close()
+    except:
+        pass
+
+    if is_sqlite3:
+        db = QSqlDatabase.addDatabase('QSQLITE')
+    else:
+        db = QSqlDatabase.addDatabase('QPSQL')
+        opts = dbname.split()
+        for opt in opts:
+            if '=' in opt:
+                opt = opt.split('=')
+                if opt[0] == 'hostname':
+                    db.setHostName(opt[1])
+                elif opt[0] == 'port':
+                    db.setPort(int(opt[1]))
+                elif opt[0] == 'username':
+                    db.setUserName(opt[1])
+                elif opt[0] == 'password':
+                    db.setPassword(opt[1])
+                elif opt[0] == 'dbname':
+                    dbname = opt[1]
+            else:
+                dbname = opt

     db.setDatabaseName(dbname)
     if not db.open():
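
The detection added above keys off sqlite's file header; the same check can be
made by hand (the file name is illustrative):

    # An sqlite3 database file begins with the 16-byte magic "SQLite format 3\0";
    # the script compares the first 15 bytes of the file named by the database argument.
    head -c 15 pt_example    # prints: SQLite format 3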
5 changes: 3 additions & 2 deletions tools/perf/scripts/python/export-to-postgresql.py
@@ -59,7 +59,7 @@
 # pt_example=# \q
 #
 # An example of using the database is provided by the script
-# call-graph-from-postgresql.py. Refer to that script for details.
+# call-graph-from-sql.py. Refer to that script for details.
 #
 # Tables:
 #
@@ -340,7 +340,8 @@ def do_query(q, s):
         'to_sym_offset bigint,'
         'to_ip bigint,'
         'branch_type integer,'
-        'in_tx boolean)')
+        'in_tx boolean,'
+        'call_path_id bigint)')
 else:
     do_query(query, 'CREATE TABLE samples ('
         'id bigint NOT NULL,'
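
The new call_path_id column appears to tie each exported sample row to an
entry in the call_paths table; a hypothetical check against a PostgreSQL
export (the database name is the one used in the script's documentation):

    # List a few samples together with the call path they were attributed to.
    psql -d pt_example -c 'SELECT id, call_path_id FROM samples LIMIT 10'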
