---

license: bsd-3-clause
pipeline_tag: robotics
tags:
  - ballnet
  - metaball
  - multimodal
  - onnx
  - pytorch
library_name: transformers
datasets:
  - asRobotics/ballnet-100k
---


# Model Card for BallNet

## Table of Contents

- [Model Card for BallNet](#model-card-for-ballnet)
  - [Table of Contents](#table-of-contents)
  - [Model Description](#model-description)
  - [Intended Use](#intended-use)
  - [Training Data](#training-data)
  - [Citation](#citation)

## Model Description

BallNet is an MLP model designed for the Metaball. It can predict both 6D force and 3D shape (mesh nodes) from the 6D motion of the ball.

Try it out on the [Spaces demo](https://huggingface.co/spaces/asRobotics/ballnet-demo).

- Developers: Xudong Han, Tianyu Wu, Fang Wan, and Chaoyang Song.
- Model type: MLP
- License: BSD-3-Clause

## Intended Use

This model is intended for researchers and developers working in robotics and tactile sensing. It can be used to enhance the capabilities of robotic systems by providing accurate predictions of force and shape based on tactile data.

To load the model:

```python
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("asRobotics/ballnet", trust_remote_code=True)

x = torch.zeros((1, 6))  # Example input: batch size of 1, 6D motion
output = model(x)
```

Or to load the ONNX version:

```python
# Example code to load the ONNX model
import numpy as np
import onnxruntime as ort
from huggingface_hub import hf_hub_download

onnx_model_path = hf_hub_download("asRobotics/ballnet", filename="model.onnx")
ort_session = ort.InferenceSession(onnx_model_path)

# Example input
x = np.zeros((1, 6), dtype=np.float32)  # Batch size of 1, 6D motion
output = ort_session.run(None, {"motion": x})
```
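
If you are unsure of the exported graph's exact input and output signature, `onnxruntime` can report it directly. The sketch below is an illustrative addition (not part of the original card) that prints the declared tensor names and shapes before running inference:

```python
import onnxruntime as ort
from huggingface_hub import hf_hub_download

onnx_model_path = hf_hub_download("asRobotics/ballnet", filename="model.onnx")
ort_session = ort.InferenceSession(onnx_model_path)

# Inspect declared inputs/outputs; the "motion" input used above should appear here.
for inp in ort_session.get_inputs():
    print("input:", inp.name, inp.shape)
for out in ort_session.get_outputs():
    print("output:", out.name, out.shape)
```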

## Training Data

The model was trained on the [BallNet-100K](https://huggingface.co/datasets/asRobotics/ballnet-100k) dataset, which includes a variety of motion, force, and shape data collected by finite element simulations.
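
To explore the dataset itself, it can be loaded with the `datasets` library. The snippet below is a minimal sketch assuming the standard loading API; the available splits and column names are not specified in this card, so it simply prints whatever the dataset exposes:

```python
from datasets import load_dataset

# Download the dataset and list its splits and features.
ds = load_dataset("asRobotics/ballnet-100k")
print(ds)
for split_name, split in ds.items():
    print(split_name, split.features)
```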

## Citation

If you use this model in your research, please cite the following papers:

```bibtex
@article{liu2024proprioceptive,
  title={Proprioceptive learning with soft polyhedral networks},
  author={Liu, Xiaobo and Han, Xudong and Hong, Wei and Wan, Fang and Song, Chaoyang},
  journal={The International Journal of Robotics Research},
  volume={43},
  number={12},
  pages={1916--1935},
  year={2024},
  publisher={SAGE Publications Sage UK: London, England},
  doi={10.1177/02783649241238765}
}
```

[arXiv:2308.08538](https://arxiv.org/abs/2308.08538)