diff --git a/docs/basics/internal.md b/docs/basics/internal.md
index 9402ca5d3..51dc30cae 100644
--- a/docs/basics/internal.md
+++ b/docs/basics/internal.md
@@ -17,8 +17,8 @@ The operator `!` with a negative integer as left argument calls an internal func
 [-10!x](#-10x-type-enum) type enum -5! [parse](../ref/parse.md)
 [-11!](#-11-streaming-execute) streaming execute -6! [eval](../ref/eval.md)
 [-14!x](#-14x-quote-escape) quote escape -7! [hcount](../ref/hcount.md)
-[-16!x](#-16x-ref-count) ref count -12! [.Q.host](../ref/dotq.md#host-hostname)
-[-18!x](#-18x-compress-byte) compress byte -13! [.Q.addr](../ref/dotq.md#addr-ip-address)
+[-16!x](#-16x-ref-count) ref count -12! [.Q.host](../ref/dotq.md#host-ip-to-hostname)
+[-18!x](#-18x-compress-byte) compress byte -13! [.Q.addr](../ref/dotq.md#addr-iphost-as-int)
 [-21!x](#-21x-compressionencryption-stats) compression/encryption stats -15! [md5](../ref/md5.md)
 [-22!x](#-22x-uncompressed-length) uncompressed length -19! [set](../ref/get.md#set)
 [-23!x](#-23x-memory-map) memory map -20! [.Q.gc](../ref/dotq.md#gc-garbage-collect)
@@ -359,7 +359,7 @@
 q)-27!(3i;0 1+123456789.4567)
 "123456790.457"
 ```
-This is a more precise, built-in version of [`.Q.f`](../ref/dotq.md#f-format) but uses IEEE754 rounding:
+This is a more precise, built-in version of [`.Q.f`](../ref/dotq.md#f-precision-format) but uses IEEE754 rounding:
 ```q
 q).045
diff --git a/docs/basics/ipc.md b/docs/basics/ipc.md
index 2a9d80986..55cee040b 100644
--- a/docs/basics/ipc.md
+++ b/docs/basics/ipc.md
@@ -398,8 +398,8 @@ The compression/decompression algorithms are proprietary and implemented as the
[`.z` namespace](../ref/dotz.md) for callback functions
-[`.Q.addr`](../ref/dotq.md#addr-ip-address) (IP address),
-[`.Q.host`](../ref/dotq.md#host-hostname) (hostname),
+[`.Q.addr`](../ref/dotq.md#addr-iphost-as-int) (IP/host as int),
+[`.Q.host`](../ref/dotq.md#host-ip-to-hostname) (IP to hostname),
 :fontawesome-solid-book-open:
 [Connection handles](handles.md)
diff --git a/docs/basics/syscmds.md b/docs/basics/syscmds.md
index 6766c7c5a..c7a757c90 100644
--- a/docs/basics/syscmds.md
+++ b/docs/basics/syscmds.md
@@ -530,8 +530,8 @@ q)1%3
 ```
 :fontawesome-solid-book:
-[`.Q.f`](../ref/dotq.md#f-format),
-[`.Q.fmt`](../ref/dotq.md#fmt-format)
+[`.Q.f`](../ref/dotq.md#f-precision-format) (precision format),
+[`.Q.fmt`](../ref/dotq.md#fmt-precision-format) (precision format with length)
 :fontawesome-solid-book-open:
 [Precision](precision.md),
diff --git a/docs/cloud/aws-lambda/index.md b/docs/cloud/aws-lambda/index.md
index 4459266b0..afa8403c2 100644
--- a/docs/cloud/aws-lambda/index.md
+++ b/docs/cloud/aws-lambda/index.md
@@ -335,7 +335,7 @@ From the response payload we see the function was successful and calculated the
 ## Stream data from Amazon S3
-To demonstrate a q/kdb+ Lambda function processing multiple events, we detail how to stream data from AWS Simple Storage Service (S3). Using FIFO named pipes and [`.Q.fps`](../../ref/dotq.md#fps-streaming-algorithm) within q, data can be streamed in for processing. To illustrate this example, we create 100 files each containing 1 million Black-Scholes input parameters. The files are placed in a S3 bucket. This S3 bucket is the trigger for the Lambda function.
+To demonstrate a q/kdb+ Lambda function processing multiple events, we detail how to stream data from AWS Simple Storage Service (S3). Using FIFO named pipes and [`.Q.fps`](../../ref/dotq.md#fps-pipe-streaming) (pipe streaming) within q, data can be streamed in for processing. To illustrate this example, we create 100 files each containing 1 million Black-Scholes input parameters. The files are placed in an S3 bucket. This S3 bucket is the trigger for the Lambda function.
 :fontawesome-brands-aws:
 [Configuring Amazon S3 Event Notifications](https://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html)
@@ -365,7 +365,7 @@ The steps in the `process_s3data.q` code are as follows.
 1. Call S3 function from `Stream_Data` script, initiate FIFO pipe and stream in S3 data.
 1. Load `blackScholes.q`.
 1. Create inputs table to store input parameters.
-1. Use [`.Q.fps`](../../ref/dotq.md#fps-streaming-algorithm) to stream in the S3 data from the FIFO `pipe_stream` to inputs table.
+1. Use [`.Q.fps`](../../ref/dotq.md#fps-pipe-streaming) to stream in the S3 data from the FIFO `pipe_stream` to the inputs table.
 1. Use [`.Q.fu`](../../ref/dotq.md#fu-apply-unique) to run the inputs through the `blackScholes` formula.
 1. `black_scholes_data` contains the input parameters and the calculated option prices.
diff --git a/docs/github.md b/docs/github.md
index e9cc40ccb..6c1927d08 100644
--- a/docs/github.md
+++ b/docs/github.md
@@ -117,6 +117,13 @@ GitHub topic queries:
+surv-cloud
+Small market surveillance application for cloud/kubernetes.
+ (Luke Britton)
+
+
+
+
 tickrecover
 Recover from tickerplant crash.
 (Simon Garland)
diff --git a/docs/interfaces/q-server-for-odbc3.md b/docs/interfaces/q-server-for-odbc3.md
index a8c05a669..a92c5973b 100644
--- a/docs/interfaces/q-server-for-odbc3.md
+++ b/docs/interfaces/q-server-for-odbc3.md
@@ -70,6 +70,8 @@ Ensure you have `ps.k` loaded into the kdb+ process specified in your DSN:
 q)\l ps.k
 ```
+The kdb+ process should also be [listening on the port](../basics/listening-port.md) chosen and defined in the ODBC configuration.
+
 ## Notes
diff --git a/docs/kb/loading-from-large-files.md b/docs/kb/loading-from-large-files.md
index 617185bce..8fa7dea96 100644
--- a/docs/kb/loading-from-large-files.md
+++ b/docs/kb/loading-from-large-files.md
@@ -13,7 +13,7 @@ The [Load CSV](../ref/file-text.md#load-csv) form of the File Text operator load
 If the data in the CSV file is too large to fit into memory, we need to break the large CSV file into manageable chunks and process them in sequence.
-Function [`.Q.fs`](../ref/dotq.md#fs-streaming-algorithm) and its variants help automate this process. `.Q.fs` loops over a file in conveniently-sized chunks of complete records, and applies a function to each chunk. This lets you implement a _streaming algorithm_ to convert a large CSV file into an on-disk database without holding all the data in memory at once.
+Function [`.Q.fs`](../ref/dotq.md#fs-file-streaming) (file streaming) and its variants help automate this process. `.Q.fs` loops over a file in conveniently-sized chunks of complete records, and applies a function to each chunk. This lets you implement a _streaming algorithm_ to convert a large CSV file into an on-disk database without holding all the data in memory at once.
 ## Using `.Q.fs`
@@ -96,11 +96,11 @@ date open high low close volume sym
 Variants of `.Q.fs` extend it to [named pipes](named-pipes.md) and control chunk size.
 :fontawesome-solid-book:
-[`.Q.fsn`](../ref/dotq.md#fsn-streaming-algorithm) for chunk size
+[`.Q.fsn`](../ref/dotq.md#fsn-file-streaming) for chunk size
 :fontawesome-solid-book:
-[`.Q.fps`](../ref/dotq.md#fps-streaming-algorithm),
-[`.Q.fpn`](../ref/dotq.md#fpn-streaming-algorithm) for named pipes
+[`.Q.fps`](../ref/dotq.md#fps-pipe-streaming),
+[`.Q.fpn`](../ref/dotq.md#fpn-pipe-streaming) for named pipes
-
-*[AWS]: Amazon Web Services
-*[GCP]: Google Cloud Platform
-*[RDB]: Realtime Database
-*[RTE]: Realtime Engine
-*[TP]: tickerplant
-*[UI]: user interface
diff --git a/mkdocs.yml b/mkdocs.yml
index 99e342411..bf4eff956 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -179,7 +179,7 @@ nav:
     - Scrabble: learn/reading/scrabble.md
   - Application examples:
     - Astronomy: wp/astronomy.md
-    - Card counters: wp/card-counters/index.md
+    - Detecting card counters: wp/card-counters/index.md
     - Corporate actions: wp/corporate-actions.md
     - Disaster management: wp/disaster-management/index.md
     - Exoplanets: wp/exoplanets/index.md
@@ -566,7 +566,7 @@ nav:
   - ODBC3: interfaces/q-server-for-odbc3.md
   - ODBC3 and Tableau: wp/data-visualization/index.md
  # - ODBC/Simba: interfaces/odbc-simba.md
-  - Pub/sub with Solace: wp/solace/index.md
+  - Solace pub/sub: wp/solace/index.md
   - Open source: github.md
   - Machine learning: ml.md
   - Using kdb+ in the cloud:
@@ -595,7 +595,6 @@ nav:
     - Amazon Web Services: cloud/autoscale/aws.md
     - Realtime data cluster: cloud/autoscale/rdc.md
     - Costs and risks: cloud/autoscale/cost-risk.md
-    - Surveillance in the Cloud: wp/surv-cloud/index.md
   - Other file systems:
     - MapR-FS: cloud/otherfs/mapr.md
     - Goofys: cloud/otherfs/goofys.md
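For quick reference, the chunked-load pattern that `loading-from-large-files.md` describes, and which this patch relabels as _file streaming_ and _pipe streaming_, looks roughly like the sketch below. The file names, column schema and chunk size are illustrative assumptions, not part of the patch.

```q
/ Minimal sketch (assumed names): load a large headerless CSV in chunks.
/ Column names and types follow the example table in loading-from-large-files.md.
colnames:`date`open`high`low`close`volume`sym
append:{.[`:newfile; (); ,; flip colnames!("DFFFFIS";",") 0: x]}   / parse one chunk of lines, append to `:newfile

.Q.fs[append] `:file.csv                 / file streaming: apply append to successive chunks of complete lines

/ alternatives to the call above:
/ .Q.fsn[append; `:file.csv; 50000000]   / file streaming with an explicit chunk size in bytes
/ .Q.fps[append] `:pipe_stream           / pipe streaming: read from a FIFO named pipe, as in the S3/Lambda example
```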