node.js - RethinkDB audit trail with timestamp and status fields on update/insert -


I am trying to create an audit trail for a RethinkDB database in Node.js.

My table contains the following fields:

order-id quantity price status timestamp

I have a file upload with the following example data:

order-id    quantity    price
1           '1000'      100
2           '500'       100
3           '1575'      100

When this file is uploaded into the empty table, these rows are inserted along with the current time and date, and the status is set to 'active'.

Now let's assume I upload the file again with updated data (the quantity of order-id 1 has changed), for example:

order-id    quantity    price
1           '1010'      100
2           '500'       100
3           '1575'      100

The RethinkDB query should set the status of the existing row for order-id 1 to 'replaced' and update its timestamp. Next it should insert a new row with the updated data, set its status to 'active', and timestamp it.
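As a sketch of the bookkeeping this implies, here is the decision logic in plain Node.js with no database driver. The function name and object shapes are my own illustration, not part of the question: given the currently active row for an order-id (or null) and the freshly uploaded row, it decides which writes to issue.

```javascript
// Hypothetical helper: decide which writes one uploaded row requires.
// `activeRow` is the currently active row for this order-id (or null),
// `uploadedRow` is the row from the new file, `now` is the timestamp to stamp.
function planWrites(activeRow, uploadedRow, now) {
  if (activeRow === null) {
    // First time we see this order-id: insert it as the active version.
    return { insert: Object.assign({}, uploadedRow, { status: 'active', timestamp: now }) };
  }
  var changed = activeRow.quantity !== uploadedRow.quantity ||
                activeRow.price !== uploadedRow.price;
  if (!changed) {
    return {}; // nothing changed: leave the table alone
  }
  return {
    // Applied to the old row: mark it replaced and re-stamp it.
    update: { status: 'replaced', timestamp: now },
    // Inserted as a new row: the updated data becomes the active version.
    insert: Object.assign({}, uploadedRow, { status: 'active', timestamp: now })
  };
}
```

With a RethinkDB driver, the `update` part would then be applied to the row matched by order-id and active status, and the `insert` part passed to a plain insert.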

So the table with the newly updated data should look like this:

order-id    quantity    price    status        timestamp
1           '1000'      100      'replaced'    '12:03am 4/22/2016'
1           '1010'      100      'active'      '12:03am 4/22/2016'
2           '500'       100      'active'      '12:00am 4/22/2016'
3           '1575'      100      'active'      '12:00am 4/22/2016'

Now I have an audit trail and can see what was changed/inserted and when.

Is this possible with RethinkDB in Node.js?

@dalanmiller

I tried to test your code, but I don't understand the prepend and the 0th-index (nth(0)) parts.

One of the assumptions you made is that only the quantity can change. What happens if the price changes, or both of them at once? Is it possible to add more fields/columns that can be updated over time?

It is also possible that an order changes a couple of times.

The data is in a variable called 'message'.

This is what I have, and it doesn't update:

r.db('testing').table('orders')
  .filter({order_id: message[ticks].order_id})
  .count()                                     // count how many rows have this order_id
  .run()
  .then(function (feed) {
    if (JSON.stringify(feed) == 1) {           // check if the order_id exists
      r.db('testing').table('orders')
        .filter({order_id: message[ticks].order_id})
        .update(message[ticks])
        .do(function (post) {
          return r.branch(
            post('replaced').eq(0),
            false,                             // if nothing changed, leave it alone
            r.db('testing').table('orders')
              .post('orders')
              .prepend(post('orders').nth(0).merge({quantity: message[ticks].quantity}))  // <-- I don't understand how this works
          );
        })
        .run();
    } else if (JSON.stringify(feed) == 0) {    // new order: insert it
      message[ticks].timestamp = new Date();   // add timestamp
      message[ticks].status = 'active';        // add status 'active'
      r.db('testing').table('orders').insert(message[ticks]).run();
    }
  });

This is possible with Node and RethinkDB.

One way is to have an array of objects within the document that represents the order itself. For this application, we're going to assume that the object at the 0th index of the order_state array is the most recent version of the order.

Simplistically, once an order is made it looks like this:

{
  order_id: 1,
  order_state: [
    { quantity: 20, price: 100, timestamp: "12:03am 4/22/2016" }
  ]
}

And that can be done with this ReQL query:

r.db("marketplace").table("orders").insert({ above_obj }) 

Now say the person who made the order decides to go back and change the order to 40 instead of 20.

r.db("marketplace").table("orders").get( id of document ).update( (doc) => {
  return {
    order_state: doc('order_state').prepend(doc('order_state').nth(0).merge({ quantity: 40 }))
  }
})

Here we first get the document with the .get() command and then .update() it. We return an object whose { order_state: } key replaces the order_state field in the original document. We take the original value of order_state and .prepend() to that array a copy of the previous 0th element, modified with .merge() to carry the updated quantity of 40.

{
  id: "a randomly generated uuid",
  order_id: 1,
  order_state: [
    { quantity: 40, price: 100, timestamp: "12:35am 4/22/2016" }, // newest modification
    { quantity: 20, price: 100, timestamp: "12:03am 4/22/2016" }  // original order
  ]
}
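Since the prepend/nth(0) combination was the confusing part, here is the same bookkeeping in plain JavaScript with no database. This mirrors what the ReQL expression does server-side; the function name is my own illustration.

```javascript
// What prepend + nth(0) + merge do, expressed as a plain array operation:
// copy the newest state, overwrite the changed fields, put the copy in front.
function addRevision(orderState, change) {
  var newest = orderState[0];                       // nth(0): the current version
  var updated = Object.assign({}, newest, change);  // merge(): copy it, overwriting changed fields
  return [updated].concat(orderState);              // prepend(): the new version goes first
}
```

Because merge() overwrites whatever keys it is given, passing { price: 120 } or { quantity: 40, price: 120 } works the same way, which covers the follow-up question about other fields changing.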

One drawback here is that if you are expecting thousands of modifications per order, you would want to put the modifications into their own table and do some sort of join between the two. But if the average number of modifications per order stays relatively small (< 100), this schema should work fine for you.

