Package details

pg-native

brianc · 464.2k weekly downloads · MIT · 3.4.5

A slightly nicer interface to Postgres over node-libpq

Keywords: postgres, pg, libpq

readme

node-pg-native


High performance native bindings between node.js and PostgreSQL via libpq with a simple API.

install

You need PostgreSQL client libraries & tools installed. An easy way to check is to type pg_config. If pg_config is in your path, you should be good to go. If it's not in your path you'll need to consult operating-system-specific instructions on how to go about getting it there.

Some ways I've done it in the past:

  • On macOS: brew install libpq
  • On Ubuntu/Debian: apt-get install libpq-dev g++ make
  • On RHEL/CentOS: yum install postgresql-devel
  • On Windows:
    1. Install Visual Studio C++ (successfully built with Express 2010). Express is free.
    2. Install PostgreSQL (http://www.postgresql.org/download/windows/)
    3. Add your PostgreSQL installation's bin folder to the system path (i.e. C:\Program Files\PostgreSQL\9.3\bin).
    4. Make sure that both libpq.dll and pg_config.exe are in that folder.

Afterwards pg_config should be in your path. Then...

$ npm i pg-native

use

async

var Client = require('pg-native')

var client = new Client();
client.connect(function(err) {
  if(err) throw err

  //text queries
  client.query('SELECT NOW() AS the_date', function(err, rows) {
    if(err) throw err

    console.log(rows[0].the_date) //Tue Sep 16 2014 23:42:39 GMT-0400 (EDT)

    //parameterized statements
    client.query('SELECT $1::text as twitter_handle', ['@briancarlson'], function(err, rows) {
      if(err) throw err

      console.log(rows[0].twitter_handle) //@briancarlson
    })

    //prepared statements
    client.prepare('get_twitter', 'SELECT $1::text as twitter_handle', 1, function(err) {
      if(err) throw err

      //execute the prepared, named statement
      client.execute('get_twitter', ['@briancarlson'], function(err, rows) {
        if(err) throw err

        console.log(rows[0].twitter_handle) //@briancarlson

        //execute the prepared, named statement again
        client.execute('get_twitter', ['@realcarrotfacts'], function(err, rows) {
          if(err) throw err

          console.log(rows[0].twitter_handle) //@realcarrotfacts

          client.end(function() {
            console.log('ended')
          })
        })
      })
    })
  })
})

sync

Because pg-native is bound to libpq it is able to provide sync operations for both connecting and queries. This is a bad idea in non-blocking systems like web servers, but is extremely convenient in scripts and bootstrapping applications - much the same way fs.readFileSync comes in handy.

var Client = require('pg-native')

var client = new Client()
client.connectSync()

//text queries
var rows = client.querySync('SELECT NOW() AS the_date')
console.log(rows[0].the_date) //Tue Sep 16 2014 23:42:39 GMT-0400 (EDT)

//parameterized queries
var rows = client.querySync('SELECT $1::text as twitter_handle', ['@briancarlson'])
console.log(rows[0].twitter_handle) //@briancarlson

//prepared statements
client.prepareSync('get_twitter', 'SELECT $1::text as twitter_handle', 1)

var rows = client.executeSync('get_twitter', ['@briancarlson'])
console.log(rows[0].twitter_handle) //@briancarlson

var rows = client.executeSync('get_twitter', ['@realcarrotfacts'])
console.log(rows[0].twitter_handle) //@realcarrotfacts

api

constructor

  • constructor Client()

Constructs and returns a new Client instance

async functions

  • client.connect(<params:string>, callback:function(err:Error))

Connect to a PostgreSQL backend server.

params is optional and is in any format accepted by libpq. The connection string is passed as is to libpq, so any format supported by libpq will be supported here. Likewise, any format unsupported by libpq will not work. If no parameters are supplied libpq will use environment variables to connect.

Returns an Error to the callback if the connection was unsuccessful. callback is required.

example
var client = new Client()
client.connect(function(err) {
  if(err) throw err

  console.log('connected!')
})

var client2 = new Client()
client2.connect('postgresql://user:password@host:5432/database?param=value', function(err) {
  if(err) throw err

  console.log('connected with connection string!')
})
  • client.query(queryText:string, <values:string[]>, callback:Function(err:Error, rows:Object[]))

Execute a query with the text of queryText and optional parameters specified in the values array. All values are passed to the PostgreSQL backend server and executed as a parameterized statement. The callback is required and is called with an Error object in the event of a query error, otherwise it is passed an array of result objects. Each element in this array is a dictionary of results with keys for column names and their values as the values for those columns.

example
var client = new Client()
client.connect(function(err) {
  if (err) throw err

  client.query('SELECT NOW()', function(err, rows) {
    if (err) throw err

    console.log(rows) // [{ "now": "Tue Sep 16 2014 23:42:39 GMT-0400 (EDT)" }]

    client.query('SELECT $1::text as name', ['Brian'], function(err, rows) {
      if (err) throw err

      console.log(rows) // [{ "name": "Brian" }]

      client.end()
    })
  })
})
  • client.prepare(statementName:string, queryText:string, nParams:int, callback:Function(err:Error))

Prepares a named statement for later execution. You must supply the name of the statement via statementName, the command to prepare via queryText and the number of parameters in queryText via nParams. Calls the callback with an Error if there was an error.

example
var client = new Client()
client.connect(function(err) {
  if(err) throw err

  client.prepare('prepared_statement', 'SELECT $1::text as name', 1, function(err) {
    if(err) throw err

    console.log('statement prepared')
    client.end()
  })

})
  • client.execute(statementName:string, <values:string[]>, callback:Function(err:Error, rows:Object[]))

Executes a previously prepared statement on this client with the name of statementName, passing it the optional array of query parameters as a values array. The callback is mandatory and is called with an Error if the execution failed, or with the same array of results as would be passed to the callback of a client.query result.

example
var client = new Client()
client.connect(function(err) {
  if(err) throw err

  client.prepare('i_like_beans', 'SELECT $1::text as beans', 1, function(err) {
    if(err) throw err

    client.execute('i_like_beans', ['Brak'], function(err, rows) {
      if(err) throw err

      console.log(rows) // [{ "i_like_beans": "Brak" }]
      client.end()
    })
  })
})
  • client.end(<callback:Function()>)

Ends the connection. Calls the optional callback when the connection is terminated.

example
var client = new Client()
client.connect(function(err) {
  if(err) throw err
  client.end(function() {
    console.log('client ended') // client ended
  })
})
  • client.cancel(callback:function(err))

Cancels the active query on the client. Callback receives an error if there was an error sending the cancel request.

example
var client = new Client()
client.connectSync()
//sleep for 100 seconds
client.query('select pg_sleep(100)', function(err) {
  console.log(err) // [Error: ERROR: canceling statement due to user request]
})
client.cancel(function(err) {
  console.log('cancel dispatched')
})

sync functions

  • client.connectSync(params:string)

Connect to a PostgreSQL backend server. Params is in any format accepted by libpq. Throws an Error if the connection was unsuccessful.

  • client.querySync(queryText:string, <values:string[]>) -> results:Object[]

Executes a query with a text of queryText and optional parameters as values. Uses a parameterized query if values are supplied. Throws an Error if the query fails, otherwise returns an array of results.

  • client.prepareSync(statementName:string, queryText:string, nParams:int)

Prepares a named statement with the name statementName and a query text of queryText. You must specify the number of parameters in the query with the nParams argument. Throws an Error if the statement cannot be prepared.

  • client.executeSync(statementName:string, <values:string[]>) -> results:Object[]

Executes a previously prepared statement on this client with the name of statementName, passing it the optional array of query parameters as a values array. Throws an Error if the execution fails, otherwise returns an array of results.
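
Because the sync functions throw instead of passing errors to a callback, a try/catch wrapper is the natural pattern. A minimal sketch, assuming a reachable backend (the statement name and query text are illustrative):

var Client = require('pg-native')

var client = new Client()

try {
  client.connectSync()

  // prepare and execute a named statement synchronously
  client.prepareSync('get_name', 'SELECT $1::text AS name', 1)
  var rows = client.executeSync('get_name', ['Brian'])
  console.log(rows[0].name) // Brian

  client.end()
} catch (err) {
  // sync methods throw on connection or query errors
  console.error('query failed: ' + err.message)
}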

testing

$ npm test

To run the tests you need a PostgreSQL backend reachable by typing psql with no connection parameters in your terminal. The tests use environment variables to connect to the backend.

An example of supplying a specific host to the tests:

$ PGHOST=blabla.mydatabasehost.com npm test

license

The MIT License (MIT)

Copyright (c) 2014 Brian M. Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

changelog

All major and minor releases are briefly explained below.

For richer information consult the commit log on github with referenced pull requests.

We do not include break-fix version releases in this file.

pg@8.15.0

  • Add support for esm importing. CommonJS importing is still also supported.

pg@8.14.0

pg@8.13.0

pg@8.12.0

pg@8.10.0

  • Emit release event when client is returned to the pool.

pg@8.9.0

pg@8.8.0

pg-pool@3.5.0

pg@8.7.0

  • Add optional config to pool to allow process to exit if pool is idle.
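
A minimal sketch of how that might be used - assuming the config key is allowExitOnIdle, as documented for pg-pool around this release:

var pg = require('pg')

// allow the node process to exit if all pooled clients are idle
// (allowExitOnIdle is assumed from the pg-pool docs of this era)
var pool = new pg.Pool({ allowExitOnIdle: true })

pool.query('SELECT NOW()', function (err, res) {
  if (err) throw err
  console.log(res.rows[0])
  // no pool.end() needed here; the idle pool no longer keeps the process alive
})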

pg-cursor@2.7.0

pg@8.6.0

pg-query-stream@4.0.0

  • Library has been converted to Typescript. The behavior is identical, but there could be subtle breaking changes due to class names changing or other small inconsistencies introduced by the conversion.

pg@8.5.0

pg@8.4.0

  • Switch to optional peer dependencies & remove semver package which has been a small thorn in the side of a few users.
  • Export DatabaseError from pg-protocol.
  • Add support for sslmode in the connection string.

pg@8.3.0

pg@8.2.0

  • Switch internal protocol parser & serializer to pg-protocol. The change is backwards compatible but results in a significant performance improvement across the board, with some queries as much as 50% faster. This is the first work to land in an on-going performance improvement initiative I'm working on. Stay tuned as things are set to get much faster still! :rocket:

pg-cursor@2.2.0

  • Switch internal protocol parser & serializer to pg-protocol. The change is backwards compatible but results in a significant performance improvement across the board, with some queries as much as 50% faster.

pg-query-stream@3.1.0

  • Switch internal protocol parser & serializer to pg-protocol. The change is backwards compatible but results in a significant performance improvement across the board, with some queries as much as 50% faster.

pg@8.1.0

  • Switch to using monorepo version of pg-connection-string. This includes better support for SSL argument parsing from connection strings and ensures continuity of support.
  • Add &ssl=no-verify option to connection string and PGSSLMODE=no-verify environment variable support for the pure JS driver. This is equivalent of passing { ssl: { rejectUnauthorized: false } } to the client/pool constructor. The advantage of having support in connection strings and environment variables is it can be "externally" configured via environment variables and CLI arguments much more easily, and should remove the need to directly edit any application code for the SSL default changes in 8.0. This should make using pg@8.x significantly less difficult on environments like Heroku for example.
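
A minimal sketch of the two equivalent ways to opt out of certificate verification (host and credentials here are placeholders):

var pg = require('pg')

// via the connection string (PGSSLMODE=no-verify in the environment works the same way)
var pool = new pg.Pool({
  connectionString: 'postgres://user:password@host:5432/database?ssl=no-verify'
})

// equivalent to passing the ssl config directly
var pool2 = new pg.Pool({
  ssl: { rejectUnauthorized: false }
})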

pg-pool@3.2.0

  • The same changes to pg impact pg-pool, as they both use the same connection parameter and connection string parsing code for configuring SSL.

pg-pool@3.1.0

pg@8.0.0

note: for detailed release notes please check here

  • Remove versions of node older than 6 LTS from the test matrix. pg>=8.0 may still work on older versions but it is no longer officially supported.
  • Change default behavior when not specifying rejectUnauthorized with the SSL connection parameters. Previously we defaulted to rejectUnauthorized: false when it was not specifically included. We now default to rejectUnauthorized: true. Manually specify { ssl: { rejectUnauthorized: false } } for old behavior.
  • Change default database when not specified to use the user config option if available. Previously process.env.USER was used.
  • Change pg.Pool and pg.Query to be ES6 classes.
  • Make pg.native non-enumerable.
  • notice messages are no longer instances of Error.
  • Passwords no longer show up when instances of clients or pools are logged.

pg@7.18.0

  • This will likely be the last minor release before pg@8.0.
  • This version contains a few bug fixes and adds a deprecation warning for a pending change in 8.0, which will flip the default value of rejectUnauthorized for SSL connections from false to true, making things more secure in the general use case.

pg-query-stream@3.0.0

  • Rewrote stream internals to better conform to node stream semantics. This should make pg-query-stream much better at respecting highWaterMark and getting rid of some edge case bugs when using pg-query-stream as an async iterator. Due to the size and nature of this change (effectively a full re-write) it's safest to bump the semver major here, though almost all tests remain untouched and still passing, which brings us to a breaking change to the API....
  • Changed stream.close to stream.destroy which is the official way to terminate a readable stream. This is a breaking change if you rely on the stream.close method on pg-query-stream...though should be just a find/replace type operation to upgrade as the semantics remain very similar (not exactly the same, since internals are rewritten, but more in line with how streams are "supposed" to behave).
  • Unified the config.batchSize and config.highWaterMark to both do the same thing: control how many rows are buffered in memory. The ReadableStream will manage exactly how many rows are requested from the cursor at a time. This should give better out of the box performance and help with efficient async iteration.
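
A rough sketch of async iteration with the rewritten stream, assuming the usual pg-query-stream constructor of (text, values, config):

const { Client } = require('pg')
const QueryStream = require('pg-query-stream')

async function run () {
  const client = new Client()
  await client.connect()

  // highWaterMark controls how many rows are buffered in memory at a time
  const query = new QueryStream('SELECT * FROM generate_series(0, $1) num', [1000], { highWaterMark: 100 })
  const stream = client.query(query)

  for await (const row of stream) {
    console.log(row.num)
  }

  await client.end()
}

run().catch(console.error)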

pg@7.17.0

  • Add support for idle_in_transaction_session_timeout option.

7.16.0

  • Add optional, opt-in behavior to test new, faster query pipeline. This is experimental, and not documented yet. The pipeline changes will grow significantly after the 8.0 release.

7.15.0

7.14.0

7.13.0

7.12.0

7.11.0

7.10.0

7.9.0

7.8.0

7.7.0

7.6.0

7.5.0

7.4.0

7.3.0

7.2.0

  • Pinned pg-pool and pg-types to a tighter semver range. This is likely not a noticeable change for you unless you were specifically installing older versions of those libraries for some reason, but making it a minor bump here just in case it could cause any confusion.

7.1.0

Enhancements

7.0.0

Breaking Changes

  • Drop support for node < 4.x.
  • Remove pg.connect pg.end and pg.cancel singleton methods.
  • Client#connect(callback) now returns undefined. It used to return an event emitter.
  • Upgrade pg-pool to 2.x.
  • Upgrade pg-native to 2.x.
  • Standardize error message fields between JS and native driver. The only breaking changes were in the native driver as its field names were brought into alignment with the existing JS driver field names.
  • Result from multi-statement text queries such as SELECT 1; SELECT 2; are now returned as an array of results instead of a single result with 1 array containing rows from both queries.
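
A hedged sketch of the new shape for multi-statement text queries (column aliases are illustrative):

var pg = require('pg')

var client = new pg.Client()
client.connect(function (err) {
  if (err) throw err

  client.query('SELECT 1 AS one; SELECT 2 AS two;', function (err, res) {
    if (err) throw err

    // each statement now gets its own result object
    console.log(res[0].rows) // [ { one: 1 } ]
    console.log(res[1].rows) // [ { two: 2 } ]
    client.end()
  })
})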

Please see here for a migration guide

Enhancements

  • Overhauled documentation: https://node-postgres.com.
  • Add Client#connect() => Promise<void> and Client#end() => Promise<void> calls. Promises are now returned from all async methods on clients if and only if no callback was supplied to the method (a sketch follows this list).
  • Add connectionTimeoutMillis to pg-pool.
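
A minimal sketch of the promise-returning calls mentioned above, omitting callbacks throughout:

var Client = require('pg').Client

var client = new Client()

// connect() and end() return promises when no callback is supplied
client.connect()
  .then(function () { return client.query('SELECT NOW()') })
  .then(function (res) {
    console.log(res.rows[0])
    return client.end()
  })
  .catch(function (err) { console.error(err) })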

v6.2.0

v6.1.0

  • Add optional callback parameter to the pure JavaScript client.end method. The native client already supported this.

v6.0.0

Breaking Changes

  • Remove pg.pools. There is still a reference kept to the pools created & tracked by pg.connect but it has been renamed, is considered private, and should not be used. Accessing this API directly was uncommon and was supposed to be private but was incorrectly documented on the wiki. Therefore, it is a breaking change of an (unintentionally) public interface to remove it by renaming it & making it private. Eventually pg.connect itself will be deprecated in favor of instantiating pools directly via new pg.Pool() so this property should become completely moot at some point. In the mean time...check out the new features...

New features

  • Replace internal pooling code with pg-pool. This is the first step in eventually deprecating and removing the singleton pg.connect. The pg-pool constructor is exported from node-postgres at require('pg').Pool. It provides a backwards compatible interface with pg.connect as well as a promise based interface & additional niceties.

You can now create an instance of a pool and don't have to rely on the pg singleton for anything:

var pg = require('pg')

var pool = new pg.Pool()

// your friendly neighborhood pool interface, without the singleton
pool.connect(function(err, client, done) {
  // ...
})

Promise support & other goodness now lives in pg-pool.

Please read the readme at pg-pool for the full api.

  • Included support for tcp keep alive. Enable it as follows:
var client = new Client({ keepAlive: true })

This should help with backends incorrectly considering idle clients to be dead and prematurely disconnecting them.

v5.1.0

  • Make the query object returned from client.query implement the promise interface. This is the first step towards promisifying more of the node-postgres api.

Example:

var client = new Client()
client.connect()
client.query('SELECT $1::text as name', ['brianc']).then(function (res) {
  console.log('hello from', res.rows[0])
  client.end()
})

v5.0.0

Breaking Changes

  • require('pg').native now returns null if the native bindings cannot be found; previously, this threw an exception.

New Features

  • better error message when passing undefined as a query parameter
  • support for defaults.connectionString
  • support for returnToHead being passed to generic pool

v4.5.0

  • Add option to parse JS date objects in query parameters as UTC

v4.4.0

  • Warn to stderr if a named query exceeds 63 characters which is the max length supported by postgres.

v4.3.0

  • Unpin pg-types semver. Allow it to float against pg-types@1.x.

v4.2.0

  • Support for additional error fields in postgres >= 9.3 if available.

v4.1.0

v4.0.0

  • Make native bindings an optional install with npm install pg-native
  • No longer surround query result callback with try/catch block.
  • Remove built in COPY IN / COPY OUT support - better implementations provided by pg-copy-streams and pg-native

v3.6.0

v3.5.0

  • Include support for parsing boolean arrays

v3.4.0

v3.2.0

v3.1.0

v3.0.0

Breaking changes

After some discussion it was decided node-postgres was non-compliant in how it was handling DATE results. They were being converted to UTC, but the PostgreSQL documentation specifies they should be returned in the client timezone. This is a breaking change, and if you use the date type you might want to examine your code and make sure nothing is impacted.

pg@v2.0 included changes to not convert large integers into their JavaScript number representation because of possibility for numeric precision loss. The same types in arrays were not taken into account. This fix applies the same type of type-coercion rules to arrays of those types, so there will be no more possible numeric loss on an array of very large int8s for example. This is a breaking change because now a return type from a query of int8[] will contain string representations of the integers. Use your favorite JavaScript bignum module to represent them without precision loss, or punch over the type converter to return the old style arrays again.

Single date parameters were properly sent to the PostgreSQL server in local time, but an input array of dates was being changed into UTC dates. This is a violation of what PostgreSQL expects. It is a small breaking change, but nonetheless something you should check out if you are inserting an array of dates.

This is a small change to bring the semantics of query more in line with other EventEmitters. The tests all passed after this change, but I suppose it could still be a breaking change in certain use cases. If you are doing clever things with the end and error events of a query object you might want to check to make sure it's still behaving normally, though it is most likely not an issue.

New features

The long & short of it is now any object you supply in the list of query values will be inspected for a .toPostgres method. If the method is present it will be called and its result used as the raw text value sent to PostgreSQL for that value. This allows the same type of custom type coercion on query parameters as was previously afforded to query result values.
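
A minimal sketch of such a value (the point-like object is purely illustrative):

var pg = require('pg')

// any object supplied as a query value may expose a toPostgres method;
// its return value is used as the raw text sent to PostgreSQL
var point = {
  x: 1,
  y: 2,
  toPostgres: function () {
    return '(' + this.x + ',' + this.y + ')'
  }
}

var client = new pg.Client()
client.connect(function (err) {
  if (err) throw err

  client.query('SELECT $1::text AS val', [point], function (err, result) {
    if (err) throw err
    console.log(result.rows[0].val) // (1,2)
    client.end()
  })
})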

If domains are active node-postgres will honor them and do everything it can to ensure all callbacks are properly fired in the active domain. If you have tried to use domains with node-postgres (or many other modules which pool long lived event emitters) you may have run into an issue where the active domain changes before and after a callback. This has been a longstanding footgun within node-postgres and I am happy to get it fixed.

Avoids a scenario where your pool could fill up with disconnected & unusable clients.

To provide better documentation and a clearer explanation of how to override the query result parsing system we broke the type converters into their own module. There is still work to do around removing the 'global-ness' of the type converters so each query or connection can return types differently, but this is a good first step and allows a much more obvious way to return int8 results as JavaScript numbers, for example.
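
A rough sketch of the kind of override this enables - OID 20 is int8, and note the parser is still global at this point:

var pg = require('pg')

// return int8 columns as JavaScript numbers instead of strings
// (beware of precision loss above Number.MAX_SAFE_INTEGER)
pg.types.setTypeParser(20, function (val) {
  return parseInt(val, 10)
})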

v2.11.0

v2.10.0

v2.9.0

v2.8.0

  • Add support for parsing JSON[] and UUID[] result types

v2.7.0

  • Use single row mode in native bindings when available [@rpedela]
    • reduces memory consumption when handling row values in 'row' event
  • Automatically bind buffer type parameters as binary [@eugeneware]

v2.6.0

  • Respect PGSSLMODE environment variable

v2.5.0

  • Ability to opt-in to int8 parsing via pg.defaults.parseInt8 = true

v2.4.0

  • Use eval in the result set parser to increase performance

v2.3.0

  • Remove built-in support for binary Int64 parsing. Due to the low usage & required compiled dependency this will be pushed into a 3rd party add-on

v2.2.0

v2.1.0

v2.0.0

  • Properly handle various PostgreSQL to JavaScript type conversions to avoid data loss:
PostgreSQL | pg@v2.0 JavaScript | pg@v1.0 JavaScript
-----------|--------------------|-------------------
float4     | number (float)     | string
float8     | number (float)     | string
int8       | string             | number (int)
numeric    | string             | number (float)
decimal    | string             | number (float)

For more information see https://github.com/brianc/node-postgres/pull/353. If you are unhappy with these changes you can always override the built-in type parsing fairly easily.

v1.3.0

  • Make client_encoding configurable and optional

v1.2.0

  • return field metadata on result object: access via result.fields[i].name/dataTypeID

v1.1.0

  • built in support for JSON data type for PostgreSQL Server @ v9.2.0 or greater

v1.0.0

  • remove deprecated functionality
    • Callback function passed to pg.connect now requires 3 arguments
    • Client#pauseDrain() / Client#resumeDrain removed
    • numeric, decimal, and float data types no longer parsed into float before being returned. Will be returned from query results as String

v0.15.0

  • client now emits end when disconnected from back-end server
  • if client is disconnected in the middle of a query, query receives an error

v0.14.0

  • add deprecation warnings in prep for v1.0
  • fix read/write failures in native module under node v0.9.x