This project uses Swagger documentation: go to http://petstore.swagger.io/ and paste http://api.dataretriever.hupi.io/swagger_doc.
- Content-Type: "application/json"
- Accept-Version: "v1"
- X-API-Token: "your_token"
Only hdr_endpoints whose `api` field is set to `true` are accessible with a non-superadmin account.
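As a sketch, a request carrying these headers can be built with Python's standard library. The endpoint path and token below are hypothetical; take the real routes from the Swagger doc.

```python
import json
import urllib.request

# Hypothetical endpoint path and token; check the Swagger doc for real routes.
req = urllib.request.Request(
    "http://api.dataretriever.hupi.io/api/v1/my_module/my_method",
    data=json.dumps({"client": "client_name", "render_type": "cursor"}).encode(),
    headers={
        "Content-Type": "application/json",
        "Accept-Version": "v1",
        "X-API-Token": "your_token",
    },
    method="POST",
)
# urllib.request.urlopen(req) would actually send the request.
```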
- `module_name`: (String)
- `method_name`: (String)
- body:

  `{ "client": client_name, "render_type": render_type, "filters": { filter_name1: { "operator": operator, "value": value }, filter_name2: value }, "query_params": { "something": 42 } }`

  - `client`: (String) client name
  - `render_type`: (String) depends on the configuration of the hdr_endpoint
  - `query_params`: (JSON)
  - `filters`: (JSON) two acceptable forms:

    `{ "filter_name1": { "operator": "<", "value": "42" }, "filter_name2": "42" }`
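For illustration, a request body combining both accepted filter forms might look like this (filter names and values here are made up):

```python
import json

# Sketch of a request body; "min_quantity" and "color" are hypothetical filter names.
body = {
    "client": "client_name",
    "render_type": "cursor",
    "filters": {
        # long form: explicit operator and value
        "min_quantity": {"operator": ">", "value": "42"},
        # short form: bare value, the hdr_filter's default_operator applies
        "color": "red",
    },
    "query_params": {"something": 42},
}
print(json.dumps(body))
```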
- `module_name`: (String)
- `method_name`: (String)
- `query_object_name`: (String)
- body:

  `{ "client": client_name, "render_type": render_type, "filters": { filter_name1: { "operator": operator, "value": value }, filter_name2: value }, "query_params": { "something": 42 } }`

  - `client`: (String) client name
  - `render_type`: (String) depends on the configuration of the hdr_endpoint
  - `query_params`: (JSON)
  - `filters`: (JSON) two acceptable forms:

    `{ "filter_name1": { "operator": "<", "value": "42" }, "filter_name2": "42" }`
- `subject`: (String)
- body:

  `{ "client": client_name, subject: { subject_features } }`

  - `client`: (String)
  - `subject`: (JSON)
  - `module_name`: (String)
  - `method_name`: (String)
Admin endpoints follow the REST convention.
- index: GET
http://api.dataretriever.hupi.io/admin/(:model_name)s
- create: POST
http://api.dataretriever.hupi.io/admin/(:model_name)
- read: GET
http://api.dataretriever.hupi.io/admin/(:model_name)/(:id)
- update: PUT
http://api.dataretriever.hupi.io/admin/(:model_name)/(:id)
- delete: DELETE
http://api.dataretriever.hupi.io/admin/(:model_name)/(:id)
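The routes above can be summarized with a small helper (an illustrative sketch, not part of the API itself):

```python
BASE = "http://api.dataretriever.hupi.io/admin"

def admin_route(action, model_name, record_id=None):
    """Map a REST action to (HTTP method, URL), mirroring the list above.
    Illustrative helper only; model/record names are caller-supplied."""
    routes = {
        "index":  ("GET",    f"{BASE}/{model_name}s"),
        "create": ("POST",   f"{BASE}/{model_name}"),
        "read":   ("GET",    f"{BASE}/{model_name}/{record_id}"),
        "update": ("PUT",    f"{BASE}/{model_name}/{record_id}"),
        "delete": ("DELETE", f"{BASE}/{model_name}/{record_id}"),
    }
    return routes[action]

print(admin_route("read", "hdr_endpoint", 7))
# -> ('GET', 'http://api.dataretriever.hupi.io/admin/hdr_endpoint/7')
```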
+---------------------+ (1,n) (1,0) +---------------------+
| hdr_account |<--------------| hdr_query_engine |
+---------------------+ +---------------------+
|*id(uuid) | |*id(Int) |
|*name(String) | |*name(String) |
|*access_token(String)| |*desc(String) |
|*role(String) | |*engine(String) |
|*created_at(DateTime)| |*settings(JSON) |
|*updated_at(DateTime)| |*created_at(DateTime)|
+---------------------+ |*updated_at(DateTime)|
^ +---------------------+
|(1,n) ^
| |(1,n)
| |
| |
|(1,0) |(1,1)
| |
+---------+-----------+ (1,n) (1,1) +----------+----------+ (1,n) (1,n) +---------------------+
| hdr_endpoint |-------------->| hdr_query_object |-------------->| hdr_export_type |
+---------------------+ +---------------------+ +---------------------+
|*id(Int) | |*id(Int) | |*id(Int) |
|*module_name(String) | |*name(String) | |*name(String) |
|*method_name(String) | |*desc(String) | |*desc(String) |
|*api(Boolean) | |*query(String) | |*render_types(Array) |
|*created_at(DateTime)| |*created_at(DateTime)| |*created_at(DateTime)|
|*updated_at(DateTime)| |*updated_at(DateTime)| |*updated_at(DateTime)|
+---------------------+ +----------+----------+ +---------------------+
|
|(1,n)
|
|
|(1,1)
V
+-------------------------+
| hdr_filters |
+-------------------------+
|*id(Int) |
|*pattern(String) |
|*value_type(String) |
|*field (String) |
|*default_operator(String)|
|*created_at(DateTime) |
|*updated_at(DateTime) |
+-------------------------+
You can create or update an hdr_endpoint with nested attributes, the Rails way. If you add an "id" key, the corresponding object will be updated.
{
"hdr_endpoint": {
"module_name": "my_module",
"method_name": "my_name",
"hdr_query_objects_attributes": {
"1449582731413": {
"hdr_query_engine_id": "1",
"hdr_export_type_ids": ["3", "4", "5"],
"query": "my query",
"hdr_filters_attributes": {
"1449582746742": {
"pattern": "my_pattern",
"value_type": "my_type",
"field": "my_field",
"default_operator": "my_operator",
"name": "my_filter_name"
},
"1449582746792": {
"pattern": "my_pattern",
"value_type": "my_type",
"field": "my_field",
"default_operator": "my_operator",
"name": "my_filter_name"
}
}
}
}
}
}
In order to have a result formatted for Highcharts, the query should rename fields according to the desired render_type. Below we list the different renaming conventions and the associated render types.
If you want to better understand what each field does, take a look at lib/export/ and spec/lib/export.
To add a filter in your query, simply insert a pattern `#_your_pattern_#` at the desired place in the query, and add the corresponding hdr_filter. The pattern will be replaced by the filter given when you call the hdr_endpoint, or by an empty string if no filter is given.
- Filter types supported are int, string, and array.
- Comparison operators can be used with int type filters.
- To compare dates, set the type to 'string' and compare; date is not supported as an explicit type.
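For illustration (the filter names below are hypothetical), a filters object exercising each supported type might look like:

```python
# Illustrative filter values covering the supported value types.
filters = {
    "min_quantity": {"operator": ">=", "value": "10"},  # int, with a comparison operator
    "color": "red",                                     # string
    "countries": ["FR", "ES"],                          # array
    "since": {"operator": ">", "value": "2015-01-01"},  # date, compared as a string
}
print(sorted(filters))
```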
- [mysql](#SQL Query)
- [postgresql](#SQL Query)
- [impala](#SQL Query)
- [mongodb](#MongoDB Query)
- [elasticsearch](#ElasticSearch Query)
- openscoring
- http
Write a valid SQL query; you can use every function that your query_engine recognizes.
Filters recognize specific patterns:
- `#_and_[something_else]_#`: add `AND` at the beginning and `AND` between each filter that matches the pattern.

  ```sql
  select fruit as category, color as serie, quantity as value from fruit_table where color="red" #_and_f1_#
  ```

- `#_where_[something_else]_#`: add `WHERE` at the beginning and `AND` between each filter that matches the pattern.

  ```sql
  select fruit as category, color as serie, quantity as value from fruit_table #_where_f1_#
  ```

- `#_limit_[something_else]_#` and `#_offset_[something_else]_#`: add a `limit` filter and an `offset` filter. An example filter value is `{ operator: "=", value: "10", field: "limit", value_type: "int" }`, which produces:

  ```sql
  select fruit as category, color as serie, quantity as value from fruit_table limit 10;
  ```
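To make the substitution concrete, here is a deliberately simplified Python sketch of how the `#_where_..._#` / `#_and_..._#` patterns expand. It is not the real engine: the real engine selects filters by the pattern name and renders them from hdr_filter definitions, while this sketch applies all given filters to every pattern.

```python
import re

def apply_sql_filters(query, filters):
    """Replace #_where_..._# / #_and_..._# patterns with rendered filter
    clauses, or with an empty string when no filter is given.
    Simplified sketch: ignores the pattern name."""
    def render(match):
        keyword = match.group(1)  # "where" or "and"
        clauses = [f"{f['field']} {f['operator']} {f['value']}" for f in filters]
        if not clauses:
            return ""
        lead = "WHERE" if keyword == "where" else "AND"
        return lead + " " + " AND ".join(clauses)
    return re.sub(r"#_(where|and)_\w+_#", render, query)

query = "select fruit as category, color as serie, quantity as value from fruit_table #_where_f1_#"
print(apply_sql_filters(query, [{"field": "quantity", "operator": ">", "value": "5"}]))
# -> select ... from fruit_table WHERE quantity > 5
print(apply_sql_filters(query, []))  # the pattern is removed when no filter is given
```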
Since MongoDB only provides an API to query the database, we use a JSON query that we match to their API.

```
{
  "collection": "collection_name",
  "query": [ operator ]
}
```
- aggregate: `db.my_collection.aggregate(mongo_aggregate_query, mongo_options)`

  `{ "operator": "aggregate", "pipeline": mongo_aggregate_query, "opts": mongo_options }`

- find: `db.my_collection.find(mongo_find_query, mongo_options)`

  `{ "operator": "find", "find": mongo_find_query, "opts": mongo_options }`

- map_reduce: `db.my_collection.map_reduce(mongo_map, mongo_reduce, mongo_options)`

  `{ "operator": "map_reduce", "map": mongo_map, "reduce": mongo_reduce, "opts": mongo_options }`
- `#_match_[something_else]_#`: add a match stage `{ "$match": { "$and": [ filters_here ] } },`; join filters with `,`.
- `#_limit_[something_else]_#`: limit the number of results. An example limit filter value is `{ operator: "$limit", value: "100", field: "total", value_type: "int" }`.

  ```
  { "collection": "metrics", "query": [ { "operator": "aggregate", "pipeline":[ #_match_f1_# { "$group": { "_id": { "year": "$year", "month": "$month"}, "value": { "$sum": "$sessions"} } }, { "$project": { "_id": 0, "category": "$_id.month", "serie": "$_id.year", "value": "$value" } }, { "$sort": { "category": 1, "serie": 1 } }, #_limit_f1_# ] } ] }
  ```

- `#_find_[something_else]_#`: `$and: [ filters_here ]`; join filters with `,`.

  ```
  { "collection": "metrics", "query": [ { "operator": "find", "filter": {#_find_f1_#}, "opts": { "$sort": { "year": 1, "month": 1, "day": 1 } } } ] }
  ```

- `#_and_[something_else]_#`: start with `,` and join filters with `,`.

  ```
  { "collection": "metrics", "query": [ { "operator": "find", "filter": {"$and": [{ "year": 2014 } #_and_f1_#]}, "opts": { "$sort": { "year": 1, "month": 1, "day": 1 } } } ] }
  ```

- `#_replace_field_filter_[something_else]_#`: start with `,` and join filters with `,`. To limit the number of documents, use a replace_field filter with `$limit` as the field. For an offset (skipping a specified number of documents), use a replace_field filter with `$skip` as the field. The operator should always be `$eq` for replace_field filters.

  ```
  { "collection": "metrics", "query": [ { "operator": "find", "filter": {"$and": [{ "year": 2014 }]}, "opts": { "$sort": { "year": 1, "month": 1, "day": 1 } } #_replace_field_filter_f3_# #_replace_field_filter_f2_# } ] }
  ```

  Example replace_field values:

  ```
  { operator: "$eq", value: "100", field: "$skip", value_type: "int" },
  { operator: "$eq", value: "100", field: "$limit", value_type: "int" }
  ```

  With these filter values, `replace_field_filter_f3` will be replaced by `{"$limit": 100}` and `replace_field_filter_f2` by `{"$skip": 100}`.
Write a regular Elasticsearch POST query; see the Elasticsearch documentation for available operators.
The results returned are the ones under the `[hits][_source]` key.
```
{
  "index": "index_name",
  "body": {
    "query" : {
      your_query
    }
  }
}
```
- `#_sub_[something_else]_#`: join operator filters with `,` and replace `#_value_#` in the operator with the value provided.
```
{
  "index": "index_name",
  "body": {
    "query" : {
      "bool": {
        "should": [
          #_sub_f1_#
        ]
      }
    }
  }
}
```
- `#_sub_and_[something_else]_#`: join operator filters with `,` and replace `#_value_#` in the operator with the value provided. Add `,` at the end of each filter.
```
{
  "index": "index_name",
  "body": {
    "query" : {
      "bool": {
        "should": [
          #_sub_and_f1_#
          {"match":{"my_field":{"query":"my_value"}}}
        ]
      }
    }
  }
}
```
HdrFilter:

```
{
  "pattern": "sub_f1",
  "value_type": "string",
  "field": "my_field",
  "default_operator": "{\"match\":{\"my_field\":{\"query\":#_value_#,\"boost\":1}}}",
  "name": "my_filter"
}
{
  "pattern": "sub_f1",
  "value_type": "string",
  "field": "my_field2",
  "default_operator": "{\"match\":{\"my_field2\":{\"query\":#_value_#,\"boost\":1}}}",
  "name": "my_filter2"
}
```

Filter:

```json
{
  "my_filter": "my_value",
  "my_filter2": "my_value2"
}
```
Query:

```
{
  "index": "index_name",
  "body": {
    "query" : {
      "bool": {
        "should": [
          #_sub_f1_#
        ]
      }
    }
  }
}
```

Query Generated:

```json
{
  "index": "index_name",
  "body": {
    "query" : {
      "bool": {
        "should": [
          {"match":{"my_field":{"query":"my_value","boost":1}}},
          {"match":{"my_field2":{"query":"my_value2","boost":1}}}
        ]
      }
    }
  }
}
```
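A minimal sketch of the `#_sub_..._#` expansion shown above, assuming the engine simply substitutes `#_value_#` in each matching hdr_filter's default_operator and joins the rendered operators with commas:

```python
import json

def render_sub_filters(hdr_filters, given):
    """For each hdr_filter whose name received a value, substitute
    #_value_# in its default_operator template, then join with commas.
    Simplified sketch of the documented behavior."""
    parts = []
    for f in hdr_filters:
        if f["name"] in given:
            parts.append(f["default_operator"].replace("#_value_#", json.dumps(given[f["name"]])))
    return ",".join(parts)

hdr_filters = [
    {"pattern": "sub_f1", "name": "my_filter",
     "default_operator": '{"match":{"my_field":{"query":#_value_#,"boost":1}}}'},
    {"pattern": "sub_f1", "name": "my_filter2",
     "default_operator": '{"match":{"my_field2":{"query":#_value_#,"boost":1}}}'},
]
rendered = render_sub_filters(hdr_filters, {"my_filter": "my_value", "my_filter2": "my_value2"})
print(rendered)
```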
To set a request timeout limit, use this option in the query engine settings: `transport_options: {request: {timeout: 1}}`
Place your PMML file in the query field. HdrFilter does not apply to this query engine. To make a prediction, provide your features as JSON in the `query_params` key.

To make a query, provide your input as JSON in the `query_params` key. This is typically used to predict results, and `query_params` is mandatory.
Example query:

```json
{
  "client": "client_name",
  "render_type": "cursor",
  "query_params": "{\"input_text\":\"blah blah\"}"
}
```
To use filters, a filter replaces the value you want to send dynamically. Such filters have no real field name: `field` and `default_operator` do not apply to http query engine filters, so fill them with dummy values. Example query_object filter:

```json
{
  "input": {"visitor_id": "#_replace_f1_#"}
}
```

And the corresponding filter values:

```
{ pattern: 'replace_f1', value_type: 'string', field: 'any_field_name', default_operator: 'does_not_apply', name: 'vid' }
```
To make a query to this endpoint with the query_object and filter values above, your query body should look like this:

```json
{
  "client": "hupi",
  "render_type": "cursor",
  "filters": "{\"vid\":\"45edcb88030b8da9\"}"
}
```
We also developed a specific case in the http_query_engine to link our Data-Retriever to an Apache Kylin instance, in order to fetch data from OLAP cubes.
To use this specific formatting, set the `http_call` setting of the query engine to `PostKylin`, and don't forget to also add the `http_header` setting:

```
{
  ...
  "http_header": "Authorization: Basic <base64 encoded 'user:pwd'>"
}
```

For more details on connecting to Kylin through its APIs, check out the Kylin documentation.
Note: when creating an endpoint with this Kylin query engine, the query field should look like:

```json
{ "sql": "select * from aTable", "project": "Kylin Project Name" }
```
All render type outputs are JSON. All outputs presented below are in the `data` key of the response body. When there is an error, the response body contains an `error` key with an explanation of the error, and the response code is not 201.
- category serie value
- fixed placement column
- serie value
- column stacked grouped
- boxplot
- small heatmap
- large heatmap
- scatter
- bubble
- leaflet
- CSV
- multiple CSV
- JSON value
- JSON array
- value
- cursor
- render type: category_serie_value, column_stacked_normal, column_stacked_percent, basic_line, basic_area, stacked_area, stacked_area_percent, multiple_column, windrose, spiderweb, column_stacked
- required field in query result: category, serie, value
- input: `[ { "category": "cat1", "serie": "ser1", "value": 10 }, { "category": "cat1", "serie": "ser2", "value": 9 }, { "category": "cat1", "serie": "ser3", "value": 6 }, { "category": "cat2", "serie": "ser1", "value": 8 }, { "category": "cat2", "serie": "ser2", "value": 7 } ]`
- output: `{ "series": [ { "name": "ser1", "data": [10, 8] }, { "name": "ser2", "data": [9, 7] }, { "name": "ser3", "data": [6, 0] } ], "categories": ["cat1", "cat2"] }`
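The pivot above can be sketched in a few lines. This is a simplified re-implementation for illustration, not the actual lib/export code:

```python
def category_serie_value(rows):
    """Pivot flat (category, serie, value) rows into Highcharts-style
    series, filling missing (category, serie) pairs with 0."""
    categories, series = [], {}
    for row in rows:
        if row["category"] not in categories:
            categories.append(row["category"])
        series.setdefault(row["serie"], {})[row["category"]] = row["value"]
    return {
        "series": [{"name": name, "data": [vals.get(c, 0) for c in categories]}
                   for name, vals in series.items()],
        "categories": categories,
    }

rows = [
    {"category": "cat1", "serie": "ser1", "value": 10},
    {"category": "cat1", "serie": "ser2", "value": 9},
    {"category": "cat1", "serie": "ser3", "value": 6},
    {"category": "cat2", "serie": "ser1", "value": 8},
    {"category": "cat2", "serie": "ser2", "value": 7},
]
out = category_serie_value(rows)
print(out)  # matches the output shown above
```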
- render type: fixed_placement_column
- required field in query result: category, serie, value
- input: `[ { "category": "cat1", "serie": "ser1", "value": 10 }, { "category": "cat1", "serie": "ser2", "value": 9 }, { "category": "cat1", "serie": "ser3", "value": 8 }, { "category": "cat1", "serie": "ser4", "value": 7 }, { "category": "cat2", "serie": "ser1", "value": 1 }, { "category": "cat2", "serie": "ser2", "value": 2 }, { "category": "cat2", "serie": "ser3", "value": 3 }, { "category": "cat2", "serie": "ser4", "value": 4 } ]`
- output: `{ "categories": ["cat1", "cat2"], "series": [ { "name": "ser1", "data": [10, 1], "color": "rgba(165,170,217,1)", "pointPlacement": -0.2, "pointPadding": 0.3 }, { "name": "ser2", "data": [9, 2], "color": "rgba(248,161,63,1)", "pointPlacement": 0.2, "pointPadding": 0.3 }, { "name": "ser3", "data": [8, 3], "color": "rgba(126,86,134,.9)", "pointPlacement": -0.2, "pointPadding": 0.4, "yAxis": 1 }, { "name": "ser4", "data": [7, 4], "color": "rgba(186,60,61,.9)", "pointPlacement": 0.2, "pointPadding": 0.4, "yAxis": 1 } ] }`
- render type: serie_value, pie_chart, half_donuts, funnel, column, multiple_infobox
- required field in query result: serie, value
- input: `[ { "serie": "ser1", "value": 10 }, { "serie": "ser2", "value": 9 }, { "serie": "ser3", "value": 6 }, { "serie": "ser1", "value": 8 }, { "serie": "ser2", "value": 7 } ]`
- output: `{ "series": [ ["ser1", 10], ["ser2", 9], ["ser3", 6], ["ser1", 8], ["ser2", 7] ] }`
- render type: column_stacked_grouped
- required field in query result: category, serie, value, stack
- input: `[ { "category": "Apples", "serie": "John", "stack": "male", "value": 5 }, { "category": "Oranges", "serie": "John", "stack": "male", "value": 3 }, { "category": "Pears", "serie": "John", "stack": "male", "value": 4 }, { "category": "Grapes", "serie": "John", "stack": "male", "value": 7 }, { "category": "Bananas", "serie": "John", "stack": "male", "value": 2 }, { "category": "Apples", "serie": "Joe", "stack": "male", "value": 3 }, { "category": "Oranges", "serie": "Joe", "stack": "male", "value": 4 }, { "category": "Pears", "serie": "Joe", "stack": "male", "value": 4 }, { "category": "Grapes", "serie": "Joe", "stack": "male", "value": 2 }, { "category": "Apples", "serie": "Jane", "stack": "female", "value": 2 }, { "category": "Oranges", "serie": "Jane", "stack": "female", "value": 5 }, { "category": "Pears", "serie": "Jane", "stack": "female", "value": 6 }, { "category": "Grapes", "serie": "Jane", "stack": "female", "value": 2 }, { "category": "Bananas", "serie": "Jane", "stack": "female", "value": 1 }, { "category": "Apples", "serie": "Janet", "stack": "female", "value": 3 }, { "category": "Pears", "serie": "Janet", "stack": "female", "value": 4 }, { "category": "Grapes", "serie": "Janet", "stack": "female", "value": 4 }, { "category": "Bananas", "serie": "Janet", "stack": "female", "value": 3 } ]`
- output: `{ "series": [ { "name": "Jane", "stack": "female", "data": [2, 5, 6, 2, 1] }, { "name": "Janet", "stack": "female", "data": [3, 0, 4, 4, 3] }, { "name": "Joe", "stack": "male", "data": [3, 4, 4, 2, 0] }, { "name": "John", "stack": "male", "data": [5, 3, 4, 7, 2] } ], "categories": ["Apples", "Oranges", "Pears", "Grapes", "Bananas"] }`
- render type: boxplot
- required field in query result: category, serie, min, first_quartil, median, third_quartil, max
- input: `[ { "category": "cat1", "serie": "observation1", "min": 1, "first_quartil": 6, "median": 12, "third_quartil": 31, "max": 45 }, { "category": "cat2", "serie": "observation1", "min": 2, "first_quartil": 7, "median": 22, "third_quartil": 32, "max": 46 }, { "category": "cat1", "serie": "observation2", "min": 3, "first_quartil": 8, "median": 32, "third_quartil": 33, "max": 47 }, { "category": "cat2", "serie": "observation2", "min": 4, "first_quartil": 9, "median": 42, "third_quartil": 34, "max": 48 } ]`
- output: `{ "categories": ["cat1", "cat2"], "series": [ { "name": "observation1", "data": [ [1, 6, 12, 31, 45], [2, 7, 22, 32, 46] ] }, { "name": "observation2", "data": [ [3, 8, 32, 33, 47], [4, 9, 42, 34, 48] ] } ] }`
- render type: small_heatmap
- required field in query result: x_category, y_category, value
- input: `[ { "x_category": "cat1", "y_category": "ser1", "value": 1 }, { "x_category": "cat1", "y_category": "ser2", "value": 9 }, { "x_category": "cat1", "y_category": "ser3", "value": 6 }, { "x_category": "cat2", "y_category": "ser1", "value": 8 }, { "x_category": "cat2", "y_category": "ser2", "value": 7 } ]`
- output: `{ "x_category": ["cat1", "cat2"], "y_category": ["ser1", "ser2", "ser3"], "data": [ [0, 0, 1], [0, 1, 9], [0, 2, 6], [1, 0, 8], [1, 1, 7], [1, 2, 0] ] }`
- render type: large_heatmap
- required field in query result: x_category, y_category, value
- input: `[ { "x_category": "cat1", "y_category": "ser1", "value": 1 }, { "x_category": "cat1", "y_category": "ser2", "value": 9 }, { "x_category": "cat1", "y_category": "ser3", "value": 6 }, { "x_category": "cat2", "y_category": "ser1", "value": 8 }, { "x_category": "cat2", "y_category": "ser2", "value": 7 } ]`
- output: `{ "x_category": ["cat1", "cat2"], "y_category": ["ser1", "ser2", "ser3"], "data": "0,0,1\n0,1,9\n0,2,6\n1,0,8\n1,1,7\n1,2,0\n" }`
- render type: scatter
- required field in query result: serie, x, y
- input: `[ { "serie": "ser1", "x": 1, "y": 1, "value": 1 }, { "serie": "ser1", "x": 1, "y": 2, "value": 9 }, { "serie": "ser2", "x": 1, "y": 3, "value": 6 }, { "serie": "ser2", "x": 2, "y": 1, "value": 8 }, { "serie": "ser1", "x": 2, "y": 2, "value": 7 } ]`
- output: `{ "series": [ { "name": "ser1", "color": "hsla(102, 70%, 50%, 0.5)", "data": [[1, 1], [1, 2], [2, 2]] }, { "name": "ser2", "color": "hsla(348, 70%, 50%, 0.5)", "data": [[1, 3], [2, 1]] } ] }`
- render type: bubble
- required field in query result: serie, x, y, z, name
- input: `[ { "serie": "ser1", "x": 1, "y": 1, "z": 1, "name": "n1" }, { "serie": "ser1", "x": 1, "y": 2, "z": 9, "name": "n2" }, { "serie": "ser2", "x": 1, "y": 3, "z": 6, "name": "n3" }, { "serie": "ser2", "x": 2, "y": 1, "z": 8, "name": "n4" }, { "serie": "ser1", "x": 2, "y": 2, "z": 7, "name": "n5" } ]`
- output: `{ "series": [ { "name": "ser1", "data": [ { "x": 1, "y": 1, "z": 1, "name": "n1" }, { "x": 1, "y": 2, "z": 9, "name": "n2" }, { "x": 2, "y": 2, "z": 7, "name": "n5" } ] }, { "name": "ser2", "data": [ { "x": 1, "y": 3, "z": 6, "name": "n3" }, { "x": 2, "y": 1, "z": 8, "name": "n4" } ] } ] }`
- render type: leaflet
- required field in query result: layer_name, collection, type, geometry_type, lat, lng. Any additional field will be included in "properties". For the Polygon type, the field `coordinates` is needed; do not use lat/lng.
- input: `[ { "collection": "features", "layer_name": "calque", "type": "Feature", "geometry_type": "Point", "lat": 10.00, "lng": 0.0, "data1": "ab", "data2": "bc" }, { "collection": "features", "layer_name": "calque", "type": "Feature", "geometry_type": "Point", "lat": 20.00, "lng": 1.0, "data1": "ac", "data2": "bd" }, { "collection": "features", "layer_name": "calque", "type": "Feature", "geometry_type": "Point", "lat": 30.00, "lng": 2.0, "data1": "ad", "data2": "be" } ]`
- output: `{ "calque": { "type": "FeatureCollection", "features": [ { "type": "Feature", "geometry": { "type": "Point", "coordinates": [0.0, 10.0] }, "properties": { "data1": "ab", "data2": "bc" } }, { "type": "Feature", "geometry": { "type": "Point", "coordinates": [1.0, 20.0] }, "properties": { "data1": "ac", "data2": "bd" } }, { "type": "Feature", "geometry": { "type": "Point", "coordinates": [2.0, 30.0] }, "properties": { "data1": "ad", "data2": "be" } } ] } }`
- render type: csv
- required field in query result: none
- input: `[ { "category": "cat1", "serie": "ser1", "value": 10, "datestamp": 20141001 }, { "category": "cat1", "serie": "ser2", "value": 9, "datestamp": 20141002 }, { "category": "cat1", "serie": "ser3", "value": 6, "datestamp": 20141003 }, { "category": "cat2", "serie": "ser1", "value": 8, "datestamp": 20141004 }, { "category": "cat2", "serie": "ser2", "value": 7, "datestamp": 20141005 } ]`
- output: `{ "rows": [ ["cat1", "ser1", 10, 20141001], ["cat1", "ser2", 9, 20141002], ["cat1", "ser3", 6, 20141003], ["cat2", "ser1", 8, 20141004], ["cat2", "ser2", 7, 20141005] ], "header": ["category", "serie", "value", "datestamp"] }`
- render type: multiple_csv
- required field in query result: none
- input: `[ { "category": "cat1", "serie": "ser1", "value": 10, "datestamp": 20141001 }, { "category": "cat1", "serie": "ser2", "value": 9, "datestamp": 20141002 }, { "category": "cat1", "serie": "ser3", "value": 6, "datestamp": 20141003 }, { "category": "cat2", "serie": "ser1", "value": 8, "datestamp": 20141004 }, { "category": "cat2", "serie": "ser2", "value": 7, "datestamp": 20141005 } ]`
- output: `{ "header": ["serie", "value", "datestamp"], "rows": { "cat1": [ ["ser1", 10, 20141001], ["ser2", 9, 20141002], ["ser3", 6, 20141003] ], "cat2": [ ["ser1", 8, 20141004], ["ser2", 7, 20141005] ] } }`
- render type: treemap2, treemap3, json_value
- required field in query result: none
- description: nest according to the order of the keys
- input: `[ { "category": "cat1", "serie": "ser1", "ser": "s", "value": 10 }, { "category": "cat1", "serie": "ser1", "ser": "s", "value": 4 }, { "category": "cat1", "serie": "ser2", "ser": "s", "value": 9 }, { "category": "cat1", "serie": "ser3", "ser": "s", "value": 6 }, { "category": "cat2", "serie": "ser1", "ser": "s", "value": 8 }, { "category": "cat2", "serie": "ser2", "ser": "s1", "value": 7 }, { "category": "cat2", "serie": "ser2", "ser": "s2", "value": 5 } ]`
- output: `{ "cat1": { "ser1": { "s": 4 }, "ser2": { "s": 9 }, "ser3": { "s": 6 } }, "cat2": { "ser1": { "s": 8 }, "ser2": { "s1": 7, "s2": 5 } } }`
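A simplified sketch of this nesting (for illustration; the real implementation lives in lib/export/): every column before `value` becomes a nesting level, and for json_value the last row wins for a duplicated path.

```python
def json_value(rows):
    """Nest rows by column order; the last non-value column supplies the
    leaf key, and later rows overwrite earlier ones for the same path."""
    out = {}
    for row in rows:
        keys = [k for k in row if k != "value"]
        node = out
        for k in keys[:-1]:
            node = node.setdefault(row[k], {})
        node[row[keys[-1]]] = row["value"]
    return out

rows = [
    {"category": "cat1", "serie": "ser1", "ser": "s", "value": 10},
    {"category": "cat1", "serie": "ser1", "ser": "s", "value": 4},
    {"category": "cat1", "serie": "ser2", "ser": "s", "value": 9},
    {"category": "cat1", "serie": "ser3", "ser": "s", "value": 6},
    {"category": "cat2", "serie": "ser1", "ser": "s", "value": 8},
    {"category": "cat2", "serie": "ser2", "ser": "s1", "value": 7},
    {"category": "cat2", "serie": "ser2", "ser": "s2", "value": 5},
]
print(json_value(rows))
```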
- render type: json_array
- required field in query result: none
- description: nest according to the order of the keys; values for the same path are collected into an array
- input: `[ { "category": "cat1", "serie": "ser1", "ser": "s", "value": 10 }, { "category": "cat1", "serie": "ser1", "ser": "s", "value": 4 }, { "category": "cat1", "serie": "ser2", "ser": "s", "value": 9 }, { "category": "cat1", "serie": "ser3", "ser": "s", "value": 6 }, { "category": "cat2", "serie": "ser1", "ser": "s", "value": 8 }, { "category": "cat2", "serie": "ser2", "ser": "s1", "value": 7 }, { "category": "cat2", "serie": "ser2", "ser": "s2", "value": 5 } ]`
- output: `{ "cat1": { "ser1": { "s": [10, 4] }, "ser2": { "s": [9] }, "ser3": { "s": [6] } }, "cat2": { "ser1": { "s": [8] }, "ser2": { "s1": [7], "s2": [5] } } }`
- render type: value, infobox
- required field in query result: none
- description: return the last value of the last key
- input: `[ { "my_field1": 14, "my_field2": 15 }, { "my_field1": 41, "my_field2": 42 } ]`
- output: `{ "value": 42 }`
- render type: cursor
- required field in query result: none
- description: send the result of the query without modification
- input: `[ { "field1": "cat1", "field2": "ser1", "field3": "s", "field4": 10 }, { "field1": "cat1", "field2": "ser1", "field3": "s", "field4": 4 }, { "field1": "cat1", "field2": "ser2", "field3": "s", "field4": 9 }, { "field1": "cat1", "field2": "ser3", "field3": "s", "field4": 6 }, { "field1": "cat2", "field2": "ser1", "field3": "s", "field4": 8 }, { "field1": "cat2", "field2": "ser2", "field3": "s1", "field4": 7 }, { "field1": "cat2", "field2": "ser2", "field3": "s2", "field4": 5 } ]`
- output: `[ { "field1": "cat1", "field2": "ser1", "field3": "s", "field4": 10 }, { "field1": "cat1", "field2": "ser1", "field3": "s", "field4": 4 }, { "field1": "cat1", "field2": "ser2", "field3": "s", "field4": 9 }, { "field1": "cat1", "field2": "ser3", "field3": "s", "field4": 6 }, { "field1": "cat2", "field2": "ser1", "field3": "s", "field4": 8 }, { "field1": "cat2", "field2": "ser2", "field3": "s1", "field4": 7 }, { "field1": "cat2", "field2": "ser2", "field3": "s2", "field4": 5 } ]`
After checking out the repo, run `bundle install` to install dependencies. Then run `rake db:reset` to create the tables and populate the database (on PostgreSQL, first run `CREATE EXTENSION "uuid-ossp"`). Then run `bundle exec rspec spec` to run the tests. You can also run `rake console` for an interactive prompt that will allow you to experiment.
Bug reports and pull requests are welcome on GitHub at https://github.com/hupi-analytics/data-retriever. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the Contributor Covenant code of conduct.
The gem is available as open source under the terms of the MIT License.