
Commit 1360c2b

Merge branch 'main' into cf-e2e
2 parents a7d1495 + 93a5cda commit 1360c2b

File tree

8 files changed, +32 −34 lines changed


README.md

Lines changed: 12 additions & 18 deletions
@@ -54,7 +54,7 @@ pip install inference-cli && inference server start --dev
 
 This will pull the proper image for your machine and start it in development mode.
 
-In development mode, a Jupyter notebook server with a quickstart guide runs on
+In development mode, a Jupyter notebook server with a quickstart guide runs on
 [http://localhost:9001/notebook/start](http://localhost:9001/notebook/start). Dive in there for a whirlwind tour
 of your new Inference Server's functionality!
 
@@ -86,7 +86,7 @@ Workflows allow you to extend simple model predictions to build computer vision
 <!-- Left cell (thumbnail) -->
 <td width="300" valign="top">
 <a href="https://youtu.be/aPxlImNxj5A">
-<img src="https://img.youtube.com/vi/aPxlImNxj5A/0.jpg"
+<img src="https://img.youtube.com/vi/aPxlImNxj5A/0.jpg"
 alt="Self Checkout with Workflows" width="300" />
 </a>
 </td>
@@ -104,7 +104,7 @@ Workflows allow you to extend simple model predictions to build computer vision
 <tr>
 <td width="300" valign="top">
 <a href="https://youtu.be/r3Ke7ZEh2Qo">
-<img src="https://img.youtube.com/vi/r3Ke7ZEh2Qo/0.jpg"
+<img src="https://img.youtube.com/vi/r3Ke7ZEh2Qo/0.jpg"
 alt="Workflows Tutorial" width="300" />
 </a>
 </td>
@@ -116,7 +116,7 @@ Workflows allow you to extend simple model predictions to build computer vision
 </strong><br />
 <strong>Created: 6 Jan 2025</strong><br /><br />
 Learn how to build and deploy Workflows for common use-cases like detecting
-vehicles, filtering detections, visualizing results, and calculating dwell
+vehicles, filtering detections, visualizing results, and calculating dwell
 time on a live video stream.
 </td>
 </tr>
@@ -125,7 +125,7 @@ Workflows allow you to extend simple model predictions to build computer vision
 <!-- Left cell (thumbnail) -->
 <td width="300" valign="top">
 <a href="https://youtu.be/tZa-QgFn7jg">
-<img src="https://img.youtube.com/vi/tZa-QgFn7jg/0.jpg"
+<img src="https://img.youtube.com/vi/tZa-QgFn7jg/0.jpg"
 alt="Smart Parking with AI" width="300" />
 </a>
 </td>
@@ -143,7 +143,7 @@ Workflows allow you to extend simple model predictions to build computer vision
 </table>
 
 ## 📟 connecting via api
-
+
 Once you've installed Inference, your machine is a fully-featured CV center.
 You can use its API to run models and workflows on images and video streams.
 By default, the server is running locally on
@@ -334,42 +334,36 @@ We would love your input to improve Roboflow Inference! Please see our [contribu
 <img
 src="https://media.roboflow.com/notebooks/template/icons/purple/youtube.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949634652"
 width="3%"
-/>
-</a>
+/></a>
 <img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
 <a href="https://roboflow.com">
 <img
 src="https://media.roboflow.com/notebooks/template/icons/purple/roboflow-app.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949746649"
 width="3%"
-/>
-</a>
+/></a>
 <img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
 <a href="https://www.linkedin.com/company/roboflow-ai/">
 <img
 src="https://media.roboflow.com/notebooks/template/icons/purple/linkedin.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633691"
 width="3%"
-/>
-</a>
+/></a>
 <img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
 <a href="https://docs.roboflow.com">
 <img
 src="https://media.roboflow.com/notebooks/template/icons/purple/knowledge.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949634511"
 width="3%"
-/>
-</a>
+/></a>
 <img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
 <a href="https://disuss.roboflow.com">
 <img
 src="https://media.roboflow.com/notebooks/template/icons/purple/forum.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633584"
 width="3%"
-/>
+/></a>
 <img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
 <a href="https://blog.roboflow.com">
 <img
 src="https://media.roboflow.com/notebooks/template/icons/purple/blog.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633605"
 width="3%"
-/>
-</a>
-</a>
+/></a>
 </div>
 </div>

docs/install/index.md

Lines changed: 4 additions & 4 deletions
@@ -16,7 +16,7 @@ Simply download the latest installer for your operating system. You can find th
 - [Download the Roboflow Inference DMG](https://github.com/roboflow/inference/releases) disk image
 - Mount hte disk image by double clicking it
 - Drag the Roboflow Inference App to the Application Folder
-- Go to your Application Folder and double click the Roboflow Infernce App to start the server
+- Go to your Application Folder and double click the Roboflow Inference App to start the server
 
 
 
@@ -34,7 +34,7 @@ notifications).
 
 ## Run via Docker
 
-The preferred way to use Inference is via Docker
+The preferred way to use Inference is via Docker
 (see [Why Docker](/understand/architecture.md#why-docker)).
 
 [Install Docker](https://docs.docker.com/engine/install/) (and
@@ -46,7 +46,7 @@ pip install inference-cli
 inference server start
 ```
 
-The `inference server start` command attempts to automatically choose
+The `inference server start` command attempts to automatically choose
 and configure the optimal container to optimize performance on your machine.
 See [Using Your New Server](#using-your-new-server) for next steps.
 
@@ -57,7 +57,7 @@ See [Using Your New Server](#using-your-new-server) for next steps.
 ## Dev Mode
 
 The `--dev` parameter to `inference server start` starts in development mode.
-This spins up a companion Jupyter notebook server with a quickstart guide on
+This spins up a companion Jupyter notebook server with a quickstart guide on
 [`localhost:9002`](http://localhost:9002). Dive in there for a whirlwind tour
 of your new Inference Server's functionality!
 
docs/install/mac.md

Lines changed: 4 additions & 4 deletions
@@ -2,16 +2,16 @@
 
 ## OSX Native App (Apple Silicon)
 
-You can now run Roboflow Inference Server on your Apple Silicon Mac using our native desktop app!
+You can now run Roboflow Inference Server on your Apple Silicon Mac using our native desktop app!
 
-Simply download the latest DMS disk image from the latest release on Github.
+Simply download the latest DMS disk image from the latest release on Github.
 ➡️ **[View Latest Release and Download Installers on Github](https://github.com/roboflow/inference/releases)**
 
-### OSX Installation Steps
+### OSX Installation Steps
 - [Download the Roboflow Inference DMG](https://github.com/roboflow/inference/releases) disk image
 - Mount hte disk image by double clicking it
 - Drag the Roboflow Inference App to the Application Folder
-- Go to your Application Folder and double click the Roboflow Infernce App to start the server
+- Go to your Application Folder and double click the Roboflow Inference App to start the server
 
 ## Using Docker
 === "CPU"

inference/core/interfaces/stream_manager/manager_app/communication.py

Lines changed: 2 additions & 2 deletions
@@ -29,15 +29,15 @@ def receive_socket_data(
             private_message=f"Header is indicating non positive payload size: {payload_size}",
             public_message=f"Header is indicating non positive payload size: {payload_size}",
         )
-    received = b""
+    received = bytearray()
     while len(received) < payload_size:
         chunk = source.recv(buffer_size)
         if len(chunk) == 0:
             raise TransmissionChannelClosed(
                 private_message="Socket was closed to read before payload was decoded.",
                 public_message="Socket was closed to read before payload was decoded.",
             )
-        received += chunk
+        received.extend(chunk)
     try:
         return json.loads(received)
     except ValueError as error:
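
The change above swaps `bytes` concatenation for a `bytearray`: `extend()` appends in place, while `received += chunk` on a `bytes` object reallocates and copies everything accumulated so far on every iteration, which degrades to quadratic time on large payloads. A minimal, self-contained sketch of the same accumulation pattern; the helper name `read_json_payload` and the plain `ConnectionError` are stand-ins, not the identifiers used in `inference`:

```python
import json
import socket


def read_json_payload(source: socket.socket, payload_size: int, buffer_size: int = 4096) -> dict:
    """Read exactly `payload_size` bytes from `source` and decode them as JSON."""
    received = bytearray()  # mutable buffer: extend() appends in place
    while len(received) < payload_size:
        chunk = source.recv(buffer_size)
        if len(chunk) == 0:  # peer closed the connection mid-payload
            raise ConnectionError("socket closed before the full payload arrived")
        received.extend(chunk)  # O(len(chunk)); `bytes += chunk` would recopy the whole buffer
    return json.loads(received)  # json.loads accepts bytes/bytearray directly
```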

inference/core/workflows/core_steps/fusion/detections_consensus/v1.py

Lines changed: 2 additions & 0 deletions
@@ -2,6 +2,7 @@
 import statistics
 from collections import Counter
 from enum import Enum
+from functools import lru_cache
 from typing import Dict, Generator, List, Literal, Optional, Set, Tuple, Type, Union
 from uuid import uuid4
 
@@ -176,6 +177,7 @@ def get_parameters_accepting_batches(cls) -> List[str]:
         return ["predictions_batches"]
 
     @classmethod
+    @lru_cache(maxsize=None)
     def describe_outputs(cls) -> List[OutputDefinition]:
         return [
             OutputDefinition(
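
The added `@lru_cache(maxsize=None)` memoizes `describe_outputs`: since the method's only argument is `cls`, the output-definition list is built once per class and returned from the cache afterwards. A rough illustration of the pattern with a stand-in class (not the real block implementation):

```python
from functools import lru_cache


class ExampleBlock:  # stand-in; not the real detections-consensus block
    @classmethod
    @lru_cache(maxsize=None)
    def describe_outputs(cls) -> list:
        # Runs only on the first call per class; later calls hit the cache.
        print("building output definitions")
        return ["predictions", "object_count"]  # placeholder payload


ExampleBlock.describe_outputs()  # prints, builds the list, caches it
ExampleBlock.describe_outputs()  # silent: served straight from the cache
```

One caveat of this pattern: the cached list is a single shared object, so callers should treat it as read-only.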

inference/models/rfdetr/rfdetr.py

Lines changed: 6 additions & 4 deletions
@@ -316,10 +316,12 @@ def postprocess(
 
         boxes_xyxy = boxes_input / scale
 
-        boxes_xyxy[:, 0] = np.clip(boxes_xyxy[:, 0], 0, orig_w)
-        boxes_xyxy[:, 1] = np.clip(boxes_xyxy[:, 1], 0, orig_h)
-        boxes_xyxy[:, 2] = np.clip(boxes_xyxy[:, 2], 0, orig_w)
-        boxes_xyxy[:, 3] = np.clip(boxes_xyxy[:, 3], 0, orig_h)
+        np.clip(
+            boxes_xyxy,
+            [0, 0, 0, 0],
+            [orig_w, orig_h, orig_w, orig_h],
+            out=boxes_xyxy,
+        )
 
         batch_predictions = np.column_stack(
             (
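
Here the four per-column clips collapse into a single vectorized `np.clip` call: the lower and upper bounds are given per column and broadcast across all rows, and `out=boxes_xyxy` writes the result in place. A quick sketch with made-up image dimensions and boxes, checking that the two forms agree:

```python
import numpy as np

# Hypothetical image size and boxes that spill past the edges.
orig_w, orig_h = 640, 480
boxes_xyxy = np.array([[-5.0, 10.0, 650.0, 470.0],
                       [100.0, -20.0, 300.0, 500.0]])

expected = boxes_xyxy.copy()
expected[:, 0] = np.clip(expected[:, 0], 0, orig_w)  # old per-column version
expected[:, 1] = np.clip(expected[:, 1], 0, orig_h)
expected[:, 2] = np.clip(expected[:, 2], 0, orig_w)
expected[:, 3] = np.clip(expected[:, 3], 0, orig_h)

# New form: one call, per-column bounds broadcast across rows, written in place.
np.clip(boxes_xyxy, [0, 0, 0, 0], [orig_w, orig_h, orig_w, orig_h], out=boxes_xyxy)

assert np.array_equal(boxes_xyxy, expected)
```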

requirements/requirements.sam.txt

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 rf-segment-anything==1.0
 samv2==0.0.4
 rasterio~=1.4.0
-torch>=2.0.1,<2.7.0
+torch>=2.0.1,<2.8.0
 torchvision>=0.15.2

requirements/requirements.transformers.txt

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-torch>=2.0.1,<2.7.0
+torch>=2.0.1,<2.8.0
 torchvision>=0.15.0
 transformers>=4.50.0,<4.52.0
 timm~=1.0.0
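
Both requirements files raise the `torch` upper bound from `<2.7.0` to `<2.8.0`, so torch 2.7.x now satisfies the pin. If you want to confirm that an installed torch falls inside the new range, a small check using the `packaging` library (an assumption here, not a dependency declared in these files) could look like:

```python
from packaging.specifiers import SpecifierSet  # assumes `packaging` is installed
import torch

installed = torch.__version__.split("+")[0]  # drop local tags like "+cu126"
assert installed in SpecifierSet(">=2.0.1,<2.8.0"), f"torch {installed} is outside the pinned range"
```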
