Commit 03fa15d

Merge branch 'main' into cf-e2e

2 parents 3f0d553 + c18e509 commit 03fa15d

File tree: 8 files changed, +2030 -3 lines

docs/server_configuration/environmental_variables.md

Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@ Environmental variable | Description
 `RUNS_ON_JETSON` | Boolean flag to tell if `inference` runs on Jetson device - set to `True` in all docker builds for Jetson architecture. | False
 `WORKFLOWS_DEFINITION_CACHE_EXPIRY` | Number of seconds to cache Workflows definitions as a result of `get_workflow_specification(...)` function call | `15 * 60` - 15 minutes
 `DOCKER_SOCKET_PATH` | Path to the local socket mounted to the container - by default empty, if provided - enables pooling docker container stats from the docker deamon socket. See more [here](./server_configuration/container_statistics.md) | Not Set
-`ENABLE_PROMETHEUS` | Boolean flag to enable Prometeus `/metrics` enpoint. | True for docker images in dockerhub
+`ENABLE_PROMETHEUS` | Boolean flag to enable Prometheus `/metrics` enpoint. | True for docker images in dockerhub
 `ENABLE_STREAM_API` | Flag to enable Stream Management API in `inference` server - see [more](/workflows/video_processing/overview.md). | False
 `STREAM_API_PRELOADED_PROCESSES` | In context of Stream API - this environment variable controlls how many idle processes are warmed-up ready to be a worker for `InferencePipeline` - helps speeding up workers processes start on GPU | 0
 `TRANSIENT_ROBOFLOW_API_ERRORS` | List of (comma separated) HTTP codes from RF API that should be retried (only applicable to GET endpoints) | `None`
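For the `TRANSIENT_ROBOFLOW_API_ERRORS` row appearing in this hunk's context, a comma-separated list of HTTP codes is typically parsed along these lines. This is a minimal sketch with a helper name of our own, not the repository's actual implementation:

import os

# Hypothetical helper: turn a comma-separated env var such as
# TRANSIENT_ROBOFLOW_API_ERRORS into a set of retryable HTTP status codes.
def parse_transient_error_codes(variable_name: str = "TRANSIENT_ROBOFLOW_API_ERRORS") -> set:
    raw_value = os.getenv(variable_name)
    if not raw_value:
        # The documented default is `None` - nothing is retried.
        return set()
    return {int(code.strip()) for code in raw_value.split(",") if code.strip()}

# Example: TRANSIENT_ROBOFLOW_API_ERRORS="502,503,504" -> {502, 503, 504}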

docs/server_configuration/service_telemetry.md

Lines changed: 2 additions & 2 deletions

@@ -14,11 +14,11 @@ Service telemetry provides essential real-time data on system health, performanc
 
 In `inference` server, we enabled:
 
-* [`prometeus`](https://prometheus.io/) metrics
+* [`prometheus`](https://prometheus.io/) metrics
 
 * docker container metrics provided by Docker daemon
 
-## 🔥 [`prometeus`](https://prometheus.io/) in `inference` server
+## 🔥 [`prometheus`](https://prometheus.io/) in `inference` server
 
 To enable metrics, set environmental variable `ENABLE_PROMETHEUS=True` in your docker container:
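Once `ENABLE_PROMETHEUS=True` is set, the `/metrics` endpoint can be sanity-checked from Python roughly as follows. This is a sketch that assumes the server is reachable on the default `inference` port 9001; adjust host and port to your deployment:

import requests

# Fetch the Prometheus exposition-format metrics from a running inference server.
# The host and port are assumptions - point them at your own deployment.
response = requests.get("http://localhost:9001/metrics", timeout=5)
response.raise_for_status()

# Print a few metric lines to confirm the endpoint is live.
for line in response.text.splitlines()[:10]:
    print(line)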

inference/core/workflows/core_steps/loader.py

Lines changed: 2 additions & 0 deletions

@@ -270,6 +270,7 @@
     EmailNotificationBlockV1,
 )
 from inference.core.workflows.core_steps.sinks.local_file.v1 import LocalFileSinkBlockV1
+from inference.core.workflows.core_steps.sinks.onvif_movement.v1 import ONVIFSinkBlockV1
 from inference.core.workflows.core_steps.sinks.roboflow.custom_metadata.v1 import (
     RoboflowCustomMetadataBlockV1,
 )
@@ -653,6 +654,7 @@ def load_blocks() -> List[Type[WorkflowBlock]]:
         SmolVLM2BlockV1,
         Moondream2BlockV1,
         OverlapBlockV1,
+        ONVIFSinkBlockV1,
     ]
 
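The two hunks above register the new block: the import makes `ONVIFSinkBlockV1` available in `loader.py`, and appending it to the list returned by `load_blocks()` exposes it to the Workflows engine. A quick way to confirm the registration after this change, as a minimal sketch assuming only the import paths shown in the diff:

# Sketch: verify the newly added block is returned by the core-steps loader.
from inference.core.workflows.core_steps.loader import load_blocks
from inference.core.workflows.core_steps.sinks.onvif_movement.v1 import ONVIFSinkBlockV1

registered_blocks = load_blocks()
assert ONVIFSinkBlockV1 in registered_blocks, "ONVIF sink block is not registered"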

inference/core/workflows/core_steps/sinks/onvif_movement/__init__.py

Whitespace-only changes.
