FAQ
How do I access the rows returned from a query? (Thanks to g40.)
The row objects have properties that align with the column names returned from the query. Given a table users with columns 'name' and 'age', doing select * from users would return you a result object with an array of row objects. Each row object would have the properties name and age. Example:
const result = await client.query('SELECT * FROM users');
console.log('name: %s and age: %d', result.rows[0].name, result.rows[0].age);
Can I iterate over a row's columns without knowing their names in advance? Why, yes. Yes you can:
const result = await client.query(...);
const firstRow = result.rows[0];
for (const columnName in firstRow) {
console.log('column %o has a value of %o', columnName, firstRow[columnName]);
}
Assuming a recordset is enumerated using the array accessor style used in [1], can we get the column names in the same fashion, i.e. is there a result.rows[i].columnName property?
This is possible using the result.fields array:
const result = await client.query(...);
console.log("Returned columns:", result.fields.map(field => field.name));
You can also count the returned columns the same way:
const result = await client.query(...);
const columnCount = result.fields.length;
If pg returns query data in JSON format, for web service applications, it would make sense to return that directly to the client. If this assumption is correct what is the most efficient method?
pg returns rows as plain JavaScript objects by default. JSON is a text format designed to be compatible with JavaScript syntax, but JavaScript objects aren't represented in memory as JSON, so that assumption is incorrect: the rows still need to be serialized before being sent to a client.
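To illustrate the point, here is a minimal sketch of that serialization step. The rows array below is a stand-in for result.rows from a real query (no database involved); in an Express handler you'd typically just pass result.rows to res.json, which calls JSON.stringify for you.

```javascript
// Stand-in for result.rows: plain JavaScript objects, not JSON text.
const rows = [
  { name: 'alice', age: 30 },
  { name: 'bob', age: 25 },
];

// JSON.stringify performs the object -> JSON-text conversion:
const body = JSON.stringify(rows);
console.log(body); // '[{"name":"alice","age":30},{"name":"bob","age":25}]'
```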
Can I get the id of a row I just inserted? Yeah, you can do this like so:
//let's pretend we have a users table with 'id' as the auto-incrementing primary key
const sql = 'INSERT INTO users(password_hash, email) VALUES($1, $2) RETURNING id';
const result = await client.query(sql, ['841l14yah', '[email protected]']);
const newlyCreatedUserId = result.rows[0].id;
Does node-postgres protect me from SQL injection? Absolutely! The parameterized query support in node-postgres is first class. Parameters are passed to the PostgreSQL server completely separately from the SQL text, ensuring proper behavior across dialects, encodings, etc. For example, this will not inject SQL:
client.query("INSERT INTO user(name) VALUES($1)", ["'; DROP TABLE user;"])
Can I create a named prepared statement for use later on without performing a query? If not, does passing the same text again to a named statement get ignored and the cached version used? I don't want to have two codepaths in a function, one for first-use and one for every other.
If a prepared statement has a name, it is only parsed once. After that, the same name will re-use the prepared statement regardless of what text is.
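As a sketch, you can package the named statement in a small factory so every call site submits the same name and text (the statement name, users table, and client variable here are illustrative, and the query config object with name, text, and values is the standard node-postgres form):

```javascript
// Builds the query config for a named prepared statement.
const fetchUserByEmail = (email) => ({
  name: 'fetch-user-by-email',            // statement name; parsed once per connection
  text: 'SELECT * FROM users WHERE email = $1',
  values: [email],
});

// First call parses and caches the statement; later calls re-use it by name:
// await client.query(fetchUserByEmail('a@example.com'));
// await client.query(fetchUserByEmail('b@example.com'));
```

This gives you a single codepath: the driver handles first-use versus re-use for you.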
Can I add my own type converters? Yes, here is a test that shows how it can be done. And for some examples of already-registered converters, take a look at the node-pg-types project.
node-postgres supports mapping simple JavaScript arrays to PostgreSQL arrays, so in most cases you can just pass it like any other parameter.
client.query("SELECT * FROM stooges WHERE name = ANY ($1)", [ ['larry', 'curly', 'moe'] ])
Note that = ANY is another way to write IN (...), but unlike IN (...) it will work how you'd expect when you pass an array as a query parameter.
If you know the length of the array in advance you can flatten it to an IN list:
// passing a flat array of values will work:
client.query("SELECT * FROM stooges WHERE name IN ($1, $2, $3)", ['larry', 'curly', 'moe'])
... but there's little benefit when = ANY works with a JavaScript array.
If you're on an old version of node-postgres or you need to create more complex PostgreSQL arrays (arrays of composite types, etc) that node-postgres isn't coping with, you can generate an array literal with dynamic SQL, but be extremely careful of SQL injection when doing this. The following approach is safe because it generates a query string with query parameters and a flattened parameter list, so you're still using the driver's support for parameterized queries ("prepared statements") to protect against SQL injection:
const stoogeNames = ['larry', 'curly', 'moe'];
const offset = 1;
const placeholders = stoogeNames.map((name, i) => '$' + (i + offset)).join(',');
client.query("SELECT * FROM stooges WHERE name IN (" + placeholders + ")", stoogeNames)
If you have other values and placeholders in your query you'll need to use a different offset value for the array placeholders. See #129 and #82 for extra discussion.
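For instance, here is a sketch of mixing one scalar placeholder with a generated IN list (the stooges table and the minAge filter are made up for illustration):

```javascript
const minAge = 30;
const stoogeNames = ['larry', 'curly', 'moe'];

// $1 is taken by minAge, so the array placeholders must start at $2:
const offset = 2;
const placeholders = stoogeNames.map((_, i) => '$' + (i + offset)).join(',');

const sql = 'SELECT * FROM stooges WHERE age > $1 AND name IN (' + placeholders + ')';
// The flattened parameter list keeps the scalar first, matching $1:
// client.query(sql, [minAge].concat(stoogeNames));
```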
Why does node-postgres come with two bindings? One in JavaScript and one "native" that uses libpq? Which one is fastest and why isn't a single binding enough?
node-postgres comes with two bindings because I wrote it back before "do one tiny thing in each module" was a popular idea. I initially wrote the pure-JavaScript bindings. People were complaining about adopting them because it wasn't a C binding so it wasn't fast. To answer their critique I wrote libpq bindings. I placed them in the same module because I could reuse 70% of the tests (all of the integration tests) so I could quickly know when the APIs diverged.
Note: sometime after v1.0 I plan on splitting the JavaScript, native, and integration tests into their own modules. The node-postgres module itself will be a sort of 'meta package' for the other modules.
Last time I checked the native bindings were faster than the pure JavaScript bindings; however, there are performance gains still available to both through code refactors and this can/will change. Either binding you use is fast enough to not end up being a significant factor in your application. As for why isn't a single binding enough? A single binding is enough - either one 😉.
Personally, I like the pure JavaScript bindings because it's JavaScript all the way down, but they both work equally well and have full feature parity, thanks to the extensive overlapping test suite.
If I open a transaction on a pooled client and forget to end it, what does the pool do for me? Nothing. You are responsible for calling either client.query('COMMIT') or client.query('ROLLBACK'). If you don't, the client will be returned to the pool with an open transaction, and I assume bad things will happen in your application.
Problem: npm install pg-native fails with the error message “Call to 'pg_config --libdir' returned exit status 1. while trying to load binding.gyp”.
You need PostgreSQL installed on your system, and the path to the PostgreSQL bin directory must be included in the PATH environment variable; pg_config lives in that bin directory.
Quick fix for PowerShell:
$env:PATH+=";C:\Program Files\PostgreSQL\9.2\bin"
npm install pg-native