Add initial code for QR ACS integration
Daniel Falk committed Dec 16, 2025
commit 4926dcfe883a73681fd044fce2f665966cc4c336
26 changes: 26 additions & 0 deletions project-qr-code-decoder-axis-camera-station/README.md
@@ -0,0 +1,26 @@
# AXIS Camera Station Integration for the FixedIT QR Code Decoder

This example uses the [FixedIT Data Agent](https://fixedit.ai/products-data-agent/) to consume QR code or barcode detections from the FixedIT QR Code Decoder ACAP application and push them to an external data source in the AXIS Camera Station VMS.

## How It Works

The FixedIT QR Code Decoder ACAP application can write each detected code as a JSON message to a Unix domain socket. The FixedIT Data Agent makes use of this and reads the messages from the socket with the `inputs.socket_listener` plugin.
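
For local experimentation, the decoder can be emulated by writing a detection message to the socket manually. The sketch below is a minimal, illustrative example (not part of the ACAP application); it assumes the default socket path `/dev/shm/fixedit.qr_code_decoder.sock` and the JSON message format documented in `combined.conf`:

```python
# send_test_detection.py - illustrative only, not part of the ACAP application.
# Writes one fake QR code detection to the Unix domain socket that the
# inputs.socket_listener plugin in combined.conf listens on.
import json
import socket
import time

SOCKET_PATH = "/dev/shm/fixedit.qr_code_decoder.sock"  # default READER_SOCKET_PATH

detection = {
    "level": "INFO",
    "message": {
        "code_type": "QR-Code",
        "decoded_data": "https://fixedit.ai",
        "frame_timestamp": int(time.time()),
        "image_height": 1080,
        "image_width": 1920,
        "log_type": "detection",
        "norm_height": 0.14,
        "norm_width": 0.08,
        "norm_center_x": 0.58,
        "norm_center_y": 0.41,
        "number_codes_in_frame": 1,
        "pid": "12345",
    },
    "source": "BarcodeReader",
}

# The "unix://" service address in combined.conf is a stream socket,
# so connect and send one newline-terminated JSON message.
with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
    sock.connect(SOCKET_PATH)
    sock.sendall((json.dumps(detection) + "\n").encode("utf-8"))
```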

TODO

### AXIS OS Compatibility

- **Minimum AXIS OS version**: Any version of AXIS OS that supports the FixedIT QR Code Decoder ACAP application and the FixedIT Data Agent.
- **Commands used**: TODO

### FixedIT Data Agent Compatibility

- **Minimum Data Agent version**: Any version.

## Quick Setup

TODO

## Configuration

TODO
@coderabbitai coderabbitai bot Dec 16, 2025

⚠️ Potential issue | 🟠 Major

README is incomplete and missing required template structure.

Per the coding guidelines, this README should follow the standardized template at .project_readme_template.md. The current state has several issues:

  1. Missing table of contents: Add <!-- toc --> and <!-- tocstop --> markers for markdown-toc generation.
  2. Missing product page link: The first mention of "FixedIT Data Agent" on line 3 should link to the product page.
  3. Multiple TODO placeholders: Lines 9, 14, 22, and 26 contain incomplete sections.
  4. Missing required sections: Troubleshooting, local testing/host testing limitations, implementation details, and a mermaid diagram explaining the system flow.

Consider updating line 3 to include the product page link properly:

-This example uses the [FixedIT Data Agent](https://fixedit.ai/products-data-agent/) to consume QR or barcode detections from the FixedIT QR Code Decoder ACAP application and push them to the external data sources in the AXIS Camera Station VMS.
+This example uses the [FixedIT Data Agent](https://fixedit.ai/products-data-agent/) to consume QR or barcode detections from the FixedIT QR Code Decoder ACAP application and push them to the external data sources in the AXIS Camera Station VMS.

The link is already present, but the README needs the full template structure including:

  • Table of contents
  • Compatibility sections (already partially present)
  • Quick setup with complete instructions
  • Troubleshooting section
  • Developer-oriented notes for local testing
  • Mermaid diagram showing the data flow

As per coding guidelines, the README should serve both IT professionals wanting plug-and-play solutions and developers wanting to customize.

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In project-qr-code-decoder-axis-camera-station/README.md lines 1-26, the README
is incomplete and doesn't follow the .project_readme_template.md: add TOC
markers <!-- toc --> and <!-- tocstop --> after the title, ensure the first
mention of "FixedIT Data Agent" at line 3 includes the official product page
link (replace or augment the existing link if needed), replace all TODO
placeholders (lines 9, 14, 22, 26) with concrete content from the template
(Quick Setup steps, Commands used, Compatibility details), and add the missing
required sections (Troubleshooting, Local/host testing limitations,
Implementation details/developer notes) plus a Mermaid diagram illustrating the
system flow; keep language suitable for both IT operators and developers and
follow the template ordering and headings exactly.

daniel-falk (Contributor, Author)

TODO

@coderabbitai coderabbitai bot

@daniel-falk, understood! I'll keep track of the README completion as you work through the TODOs. Feel free to mention me again once you'd like me to review the completed sections.


🧠 Learnings used
Learnt from: daniel-falk
Repo: fixedit-ai/fixedit-data-agent-examples PR: 5
File: project-strobe-color-from-github-workflow/README.md:57-62
Timestamp: 2025-08-07T15:23:05.927Z
Learning: In the fixedit-ai/fixedit-data-agent-examples repository, when GitHub generates anchors for headings that contain backticks with spaces around them (like `### `filename` - Description`), it removes the backticks but preserves those spaces as dashes in the anchor, creating triple dashes (---) between the filename and description parts. TOC generators may not handle this correctly and need manual adjustment.

Learnt from: daniel-falk
Repo: fixedit-ai/fixedit-data-agent-examples PR: 24
File: README.md:31-34
Timestamp: 2025-10-15T11:33:37.276Z
Learning: In the fixedit-ai/fixedit-data-agent-examples repository, prefer automatic TOC generation using `<!-- toc

Learnt from: daniel-falk
Repo: fixedit-ai/fixedit-data-agent-examples PR: 5
File: project-strobe-color-from-github-workflow/README.md:350-351
Timestamp: 2025-08-10T14:54:48.316Z
Learning: In the fixedit-data-agent-examples repository, shell portability requirements (such as Axis devices using POSIX /bin/sh instead of bash) should be documented in a general scripting guide rather than repeated in individual project README files. This approach was confirmed by daniel-falk for better documentation organization.

Learnt from: daniel-falk
Repo: fixedit-ai/fixedit-data-agent-examples PR: 18
File: dashboard-deployments/system-monitoring-influxdb2-flux-grafana/provisioning/dashboards/overview_of_devices.json:1121-1130
Timestamp: 2025-09-03T14:18:52.406Z
Learning: When fixing unit mismatches in Grafana dashboards, daniel-falk prefers changing the panel unit configuration to match the data rather than transforming the query values, choosing simplicity over data conversion when both approaches are valid.

175 changes: 175 additions & 0 deletions project-qr-code-decoder-axis-camera-station/combined.conf
@@ -0,0 +1,175 @@
# Subscribe to the FixedIT QR Code Decoder and push to ACS ExternalDataFacade
#
# This configuration subscribes to detections from the FixedIT QR Code Decoder ACAP
# application by setting up a Unix domain socket to which the application can
# write messages. Each detection is then posted to an ACS (AXIS Camera Station)
# server using the ExternalDataFacade API endpoint.
#
# Environment Variables:
# - READER_SOCKET_PATH: Path on which to create the Unix domain socket (defaults to
# "/dev/shm/fixedit.qr_code_decoder.sock")
# - ACS_SERVER_IP: IP address or hostname of the ACS server (required)
# - ACS_SOURCE: Source identifier for the data (required)
# - ACS_USERNAME: Username for basic authentication (required)
# - ACS_PASSWORD: Password for basic authentication (required)
# - HELPER_FILES_DIR: Directory for debug log files (required, set automatically by the FixedIT Data Agent)
# - ACS_EXTERNAL_DATA_TYPE: Type identifier for the external data (defaults to "QRCodeDetection")
# - ACS_PORT: Port number for the ACS server API (defaults to "55756")
# - CURL_INSECURE: Set to "true" to skip SSL certificate verification (defaults to "false")
# - TELEGRAF_DEBUG: Enable debug logging (defaults to "false")

[agent]
# The debug mode (true/false) which controls the verbosity of Telegraf.
debug = ${TELEGRAF_DEBUG:-false}

# Max number of metrics to buffer in memory if they can't be sent to
# the output. If the limit is too low, metrics will be lost when the
# output is not accessible (e.g. loss of network connection). If it is
# too large, memory might fill up during a long outage and the server
# will be flooded once it is back online.
metric_buffer_limit = 10000

# Set up the input that will consume QR code detections.
#
# Example JSON message format (one message per barcode/QR code detected):
# {
#   "level": "INFO",
#   "message": {
#     "code_type": "QR-Code",
#     "decoded_data": "https://fixedit.ai",
#     "frame_timestamp": 1760453555,
#     "image_height": 1080,
#     "image_width": 1920,
#     "log_type": "detection",
#     "norm_height": 0.1388888955116272,
#     "norm_width": 0.07656252384185791,
#     "norm_center_x": 0.5833333134651184,
#     "norm_center_y": 0.41111111640930176,
#     "number_codes_in_frame": 2,
#     "pid": "1805635"
#   },
#   "source": "BarcodeReader"
# }
[[inputs.socket_listener]]
# Listen on Unix domain socket for barcode detection messages
service_address = "unix://${READER_SOCKET_PATH:-/dev/shm/fixedit.qr_code_decoder.sock}"

# Set socket permissions to allow group write access
# This enables applications running as a dynamic user to write to the socket
socket_mode = "0664"

# Set the read timeout. It's a bit unclear what effect this has, but it's most
# likely the maximum read time for a single message. Setting it is good for
# robustness even though it's unlikely to trigger when using a Unix domain socket.
read_timeout = "30s"

# Set the buffer size of the socket. This allows some elasticity in the
# writing and reading of the socket, but the main buffering is happening
# in the Telegraf output buffer. This must absolutely not be smaller than
# the expected maximum size of a message. The maximum size of data a
# QR code can encode is about 3kB, but there might be a few messages in
# the socket if we don't have time to read them all directly.
read_buffer_size = "300KiB"

# Parse incoming data as JSON
data_format = "json_v2"

# Override measurement name for consistency in database
name_override = "barcode_reader_app"

# Default tags that will be added to all measurements
[inputs.socket_listener.tags]
input_method = "socket"

[[inputs.socket_listener.json_v2]]
# Get all fields and top-level tags
[[inputs.socket_listener.json_v2.object]]
path = "@this"

# Mark which values should be used as tags. Note that we need to reference
# them as if 'disable_prepend_keys' was not set, but the names will be
# changed before sending them so that e.g. "message_pid" becomes a tag
# called "pid".
tags = ["source"]

# Disable prepend keys. Since the input is a nested object,
# we would get "message_" prefix for all fields in the 'message'
# sub-object. By disabling prepend keys, we keep only the inner
# keys.
disable_prepend_keys = true
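
# For the example message shown above, the resulting metric is roughly
# (illustrative summary, not captured from a running device):
#   name:   barcode_reader_app
#   tags:   source="BarcodeReader", input_method="socket"
#   fields: level, code_type, decoded_data, frame_timestamp, image_width,
#           image_height, log_type, the norm_* values, number_codes_in_frame, pid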

# Starlark processor to calculate code position and size from normalized coordinates
[[processors.starlark]]
# Filter to process only barcode reader app metrics
namepass = ["barcode_reader_app"]

# Starlark script to calculate position and size string
source = '''
# Helper function to check if all values are not None
def has_all(*values):
    for value in values:
        if value == None:
            return False
    return True

# Helper function to convert normalized value (0.0-1.0) to whole percentage
def to_whole_percent(normalized_value):
    return int(normalized_value * 100 + 0.5)

def apply(metric):
    # Get normalized coordinates from fields
    # These are values between 0.0 and 1.0 representing position/size as fraction of image
    norm_center_x = metric.fields.get("norm_center_x")
    norm_center_y = metric.fields.get("norm_center_y")
    norm_width = metric.fields.get("norm_width")
    norm_height = metric.fields.get("norm_height")

    # Only calculate if all normalized values are present
    if has_all(norm_center_x, norm_center_y, norm_width, norm_height):
        # Convert to percentages and round to integers
        x_percent = to_whole_percent(norm_center_x)
        y_percent = to_whole_percent(norm_center_y)
        w_percent = to_whole_percent(norm_width)
        h_percent = to_whole_percent(norm_height)

        # Format as string: "pos: x=22%,y=90% size: w=5%,h=3%"
        position_size = "pos: x={}%,y={}% size: w={}%,h={}%".format(x_percent, y_percent, w_percent, h_percent)
        metric.fields["code_position_size"] = position_size

    return metric
'''
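# Example: for the detection shown in the input section comment above
# (norm_center_x=0.5833..., norm_center_y=0.4111..., norm_width=0.0766...,
# norm_height=0.1388...), the processor adds the field:
#   code_position_size = "pos: x=58%,y=41% size: w=8%,h=14%"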

# Set up the output that posts QR code detections to ACS ExternalDataFacade API.
#
# This configuration uses Telegraf's exec output plugin to execute the post_to_acs.sh
# script for each QR code detection. The script receives JSON data via stdin and posts
# it to the ACS server.
[[outputs.exec]]
# Filter to process only barcode reader app metrics
# This ensures the script only runs when QR code detections occur
# and ignores other metrics that might be in the pipeline
namepass = ["barcode_reader_app"]

# Shell script command to execute for each detection
command = ["${HELPER_FILES_DIR}/post_to_acs.sh"]

# Data format of the data sent to the script via stdin
data_format = "json"
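# With this serializer, the script receives roughly one JSON object per
# detection on stdin, e.g. (illustrative sketch, values shortened):
#   {"fields": {"decoded_data": "https://fixedit.ai", "code_position_size": "pos: ..."},
#    "name": "barcode_reader_app",
#    "tags": {"source": "BarcodeReader", "input_method": "socket"},
#    "timestamp": 1760453555}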

# Disable batch format to send individual metrics to the script.
# When false: sends single metric JSON object per execution.
# When true: would send array of metrics (not needed for this use case).
use_batch_format = false

# Process one metric at a time for immediate response.
# This ensures each detection triggers script execution immediately.
# Higher values would batch multiple detections (undesirable for real-time processing).
metric_batch_size = 1

# Script execution timeout to prevent hanging.
# The script makes an HTTPS POST request to the ACS server.
# 10 seconds allows sufficient time for network round trips and error handling.
# Setting this makes the application more robust since it can recover from
# a hanging script.
timeout = "10s"