diff --git a/docs/design/autoware-architecture/perception/index.md b/docs/design/autoware-architecture/perception/index.md
index 9f91c74655..9ce29a3124 100644
--- a/docs/design/autoware-architecture/perception/index.md
+++ b/docs/design/autoware-architecture/perception/index.md
@@ -103,6 +103,7 @@ As mentioned in the goal session, this perception module is designed to be exten
| Camera DNN based 2D detector | This module takes camera images as input and detects objects such as vehicles, trucks, buses, pedestrians, and bicycles in the two-dimensional image space. It detects objects within image coordinates and providing 3D coordinate information is not mandatory. | - Camera Images |
| LiDAR Clustering | This module performs clustering of point clouds and shape estimation to achieve object detection without labels. | - Point Clouds |
| Semi-rule based detector | This module detects objects using information from both images and point clouds, and it consists of two components: LiDAR Clustering and Camera DNN based 2D detector. | - Output from Camera DNN based 2D detector and LiDAR Clustering |
+| Radar based 3D detector | This module takes radar data as input and detects dynamic 3D objects. For details, see [this document](reference-implementations/radar-based-3d-detector/radar-based-3d-detector.md). | - Radar data |
| Object Merger | This module integrates results from various detectors. | - Detected Objects |
| Interpolator | This module stabilizes the object detection results by maintaining long-term detection results using Tracking results. | - Detected Objects - Tracked Objects |
| Tracking | This module gives ID and estimate velocity to the detection results. | - Detected Objects |
diff --git a/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/faraway-object-detection.md b/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/faraway-object-detection.md
new file mode 100644
index 0000000000..c157e360b9
--- /dev/null
+++ b/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/faraway-object-detection.md
@@ -0,0 +1,64 @@
+# Faraway dynamic object detection with radar objects
+
+## Overview
+
+The following diagram describes the pipeline for faraway dynamic object detection with radar.
+
+![faraway object detection](image/faraway-object-detection.drawio.svg)
+
+## Reference implementation
+
+### Crossing filter
+
+- [radar_crossing_objects_noise_filter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_crossing_objects_noise_filter)
+
+This package filters noise objects that cross in front of the ego vehicle, which are most likely ghost objects.
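+
+A minimal sketch of the idea behind such a filter, assuming objects are expressed in the `base_link` frame; the function and threshold names below are hypothetical, not the package's actual parameters:
+
+```python
+import math
+from dataclasses import dataclass
+
+@dataclass
+class RadarObject:
+    x: float   # position in base_link [m]
+    y: float
+    vx: float  # estimated velocity in base_link [m/s]
+    vy: float
+
+def is_crossing_noise(obj: RadarObject,
+                      angle_threshold: float = math.radians(60.0),
+                      velocity_threshold: float = 3.0) -> bool:
+    """Flag objects whose velocity direction is nearly perpendicular to
+    the line of sight from the ego vehicle; a radar measures radial
+    velocity, so such motion often comes from ghost reflections."""
+    line_of_sight = math.atan2(obj.y, obj.x)
+    heading = math.atan2(obj.vy, obj.vx)
+    # Fold the angle difference into [0, pi/2]: 0 = radial, pi/2 = crossing.
+    crossing_angle = abs(math.remainder(heading - line_of_sight, math.pi))
+    speed = math.hypot(obj.vx, obj.vy)
+    return crossing_angle > angle_threshold and speed > velocity_threshold
+```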
+
+### Velocity filter
+
+- [object_velocity_splitter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/object_velocity_splitter)
+
+Radar detections of static objects contain much noise, such as reflections from the ground,
+while in many cases radars can detect dynamic objects stably.
+To filter out static objects, `object_velocity_splitter` can be used.
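+
+A minimal sketch of this splitting, assuming each detection carries an estimated 2D velocity; the threshold value is an illustration, not the package default:
+
+```python
+import math
+
+def split_by_velocity(objects: list[dict], threshold_mps: float = 3.0):
+    """Split detections into dynamic and static groups by speed."""
+    dynamic = [o for o in objects if math.hypot(o["vx"], o["vy"]) >= threshold_mps]
+    static = [o for o in objects if math.hypot(o["vx"], o["vy"]) < threshold_mps]
+    return dynamic, static
+
+# The first detection is kept as dynamic, the second is split off as static.
+dynamic, static = split_by_velocity([{"vx": 10.0, "vy": 0.5}, {"vx": 0.2, "vy": 0.0}])
+```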
+
+### Range filter
+
+- [object_range_splitter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/object_range_splitter)
+
+With some radars, ghost objects sometimes appear at near range.
+To filter out these objects, `object_range_splitter` can be used.
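+
+The same splitting idea applied to range, again as a hedged sketch with an illustrative threshold:
+
+```python
+import math
+
+def split_by_range(objects: list[dict], min_range_m: float = 30.0):
+    """Split detections by distance from the ego vehicle so that the
+    near-range group, where ghosts are frequent, can be dropped."""
+    far = [o for o in objects if math.hypot(o["x"], o["y"]) >= min_range_m]
+    near = [o for o in objects if math.hypot(o["x"], o["y"]) < min_range_m]
+    return far, near
+```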
+
+### Vector map filter
+
+- [object-lanelet-filter](https://github.com/autowarefoundation/autoware.universe/blob/main/perception/detected_object_validation/object-lanelet-filter.md)
+
+In most cases, vehicles drive within the drivable area.
+To filter out objects that are outside the drivable area, `object-lanelet-filter` can be used.
+`object-lanelet-filter` removes objects that lie outside the drivable area defined by the vector map.
+
+Note that if you use `object-lanelet-filter` for faraway radar detection, the vector map needs to define the drivable area beyond the area where the autonomous vehicle itself drives.
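+
+Conceptually, the filter keeps only objects whose position lies inside a drivable-area polygon taken from the vector map. The ray-casting test below is only a stand-in for the lanelet2 lookup that the real filter performs:
+
+```python
+def point_in_polygon(x: float, y: float, polygon: list[tuple[float, float]]) -> bool:
+    """Ray-casting point-in-polygon test over a 2D polygon."""
+    inside = False
+    n = len(polygon)
+    for i in range(n):
+        x1, y1 = polygon[i]
+        x2, y2 = polygon[(i + 1) % n]
+        # Toggle on every edge that a horizontal ray from (x, y) crosses.
+        if (y1 > y) != (y2 > y):
+            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
+            if x < x_cross:
+                inside = not inside
+    return inside
+
+drivable_area = [(0.0, 0.0), (200.0, 0.0), (200.0, 10.0), (0.0, 10.0)]
+objects = [{"x": 150.0, "y": 5.0}, {"x": 150.0, "y": 50.0}]
+kept = [o for o in objects if point_in_polygon(o["x"], o["y"], drivable_area)]
+```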
+
+### Radar object clustering
+
+- [radar_object_clustering](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_object_clustering)
+
+This package combines multiple radar detections that come from one object into a single detection and adjusts its class and size.
+This suppresses object splitting in the tracking module, as sketched after the figure below.
+
+![radar_object_clustering](https://raw.githubusercontent.com/autowarefoundation/autoware.universe/main/perception/radar_object_clustering/docs/radar_clustering.drawio.svg)
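+
+The greedy sketch below illustrates the idea under simplified assumptions (2D positions and velocities, illustrative thresholds); the actual package has its own matching criteria and parameters:
+
+```python
+import math
+
+def cluster_detections(objects: list[dict],
+                       max_dist_m: float = 5.0,
+                       max_dvel_mps: float = 2.0) -> list[dict]:
+    """Greedily merge detections that are close in position and
+    velocity, and represent each cluster by its centroid."""
+    clusters: list[list[dict]] = []
+    for obj in objects:
+        for cluster in clusters:
+            rep = cluster[0]
+            close = math.hypot(obj["x"] - rep["x"], obj["y"] - rep["y"]) < max_dist_m
+            similar = math.hypot(obj["vx"] - rep["vx"], obj["vy"] - rep["vy"]) < max_dvel_mps
+            if close and similar:
+                cluster.append(obj)
+                break
+        else:
+            clusters.append([obj])
+
+    def centroid(cluster: list[dict]) -> dict:
+        n = len(cluster)
+        return {key: sum(o[key] for o in cluster) / n for key in ("x", "y", "vx", "vy")}
+
+    return [centroid(c) for c in clusters]
+```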
+
+## Note
+
+### Parameter tuning
+
+Detection performed only by radar applies various strong noise-filtering steps.
+This creates a trade-off: if the noise filtering is too strong, objects that you originally wanted to detect disappear, while if it is too weak, ghost objects constantly appear in front of the ego vehicle and the self-driving system cannot start moving.
+Parameters need to be tuned with this trade-off in mind.
+
+### Limitation
+
+- Elevated railways and vehicles on multi-level intersections
+
+If you use 2D radars (radars that detect in the x-y plane but provide no z-axis information) and the driving area contains elevated railways or vehicles on multi-level intersections, the radar pipeline detects these objects, which adversely affects the planning results.
+In addition, because the radar pipeline currently has no label classification feature, an elevated railway is detected as a vehicle, which leads to unintended behavior.
diff --git a/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/image/faraway-object-detection.drawio.svg b/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/image/faraway-object-detection.drawio.svg
new file mode 100644
index 0000000000..1386475f35
--- /dev/null
+++ b/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/image/faraway-object-detection.drawio.svg
@@ -0,0 +1,219 @@
+
diff --git a/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/image/radar-based-3d-detector.drawio.svg b/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/image/radar-based-3d-detector.drawio.svg
new file mode 100644
index 0000000000..85fa9a26b6
--- /dev/null
+++ b/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/image/radar-based-3d-detector.drawio.svg
@@ -0,0 +1,379 @@
+
diff --git a/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/radar-based-3d-detector.md b/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/radar-based-3d-detector.md
new file mode 100644
index 0000000000..e8992a0555
--- /dev/null
+++ b/docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/radar-based-3d-detector.md
@@ -0,0 +1,88 @@
+# Radar based 3D detector
+
+## Overview
+
+### Features
+
+Radar based 3D detector aims for the following:
+
+- Detecting objects farther than the range of LiDAR-based 3D detection.
+
+Since radar can acquire data from a longer distance than LiDAR (> 100 m), the radar-based 3D detector can be applied when the range of LiDAR-based 3D detection is insufficient.
+The detection distance of radar-based 3D detection depends on the radar device specification.
+
+- Improving velocity estimation for dynamic objects.
+
+Radar can measure velocity directly, so more precise twist information can be estimated by fusing radar information with the objects from LiDAR-based 3D detection.
+This improves the performance of object tracking/prediction and of planning features such as adaptive cruise control.
+
+### Whole pipeline
+
+The radar-based 3D detector with radar objects consists of:
+
+- 3D object detection with Radar pointcloud
+- Noise filter
+- Faraway dynamic 3D object detection
+- Radar fusion to LiDAR-based 3D object detection
+- Radar object tracking
+- Merger of tracked objects
+
+![Radar based 3D detector](image/radar-based-3d-detector.drawio.svg)
+
+### Interface
+
+- Input
+  - Message type for the radar pointcloud is `ros-perception/radar_msgs/msg/RadarScan.msg` (see the conversion sketch after this list).
+ - Message type for radar objects is `autoware_auto_perception_msgs/msg/DetectedObject`.
+ - Input objects need to be concatenated.
+ - Input objects need to be compensated with ego motion.
+ - Input objects need to be transformed to `base_link`.
+- Output
+ - Tracked objects
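+
+As a reading aid, the sketch below converts one polar radar return (the `range`, `azimuth`, and `elevation` fields of `radar_msgs/msg/RadarReturn`) to a Cartesian point in the radar frame; the real pipeline additionally applies the TF to `base_link` and ego motion compensation:
+
+```python
+import math
+
+def radar_return_to_point(range_m: float, azimuth_rad: float, elevation_rad: float):
+    """Convert one polar radar return to a Cartesian (x, y, z) point."""
+    xy = range_m * math.cos(elevation_rad)  # projection onto the x-y plane
+    return (xy * math.cos(azimuth_rad),
+            xy * math.sin(azimuth_rad),
+            range_m * math.sin(elevation_rad))
+```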
+
+## Module
+
+### Radar pointcloud 3D detection
+
+!!! warning
+
+ Under Construction
+
+### Noise filter and radar faraway dynamic 3D object detection
+
+![faraway object detection](image/faraway-object-detection.drawio.svg)
+
+This function filters noise objects and detects faraway (> 100 m) dynamic vehicles.
+The main idea is that, when LiDAR is used, the near range can be detected accurately from the LiDAR pointcloud, so the main role of radar is to detect distant objects that cannot be detected with LiDAR alone.
+For details, see [this document](faraway-object-detection.md).
+
+### Radar fusion to LiDAR-based 3D object detection
+
+- [radar_fusion_to_detected_object](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_fusion_to_detected_object)
+
+This package contains a sensor fusion module for radar-detected objects and 3D detected objects. The fusion node can:
+
+- Attach velocity to 3D detections when successfully matching radar data. The tracking modules use the velocity information to enhance the tracking results while planning modules use it to execute actions like adaptive cruise control.
+- Improve the low confidence 3D detections when corresponding radar detections are found.
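+
+The velocity-attachment step can be sketched as follows, assuming simple radius matching in 2D; the real node's matching and weighting logic is more involved, and all names and thresholds here are illustrative:
+
+```python
+import math
+
+def attach_radar_velocity(detections: list[dict],
+                          radar_objects: list[dict],
+                          match_radius_m: float = 3.0) -> None:
+    """For each 3D detection, average the velocities of radar objects
+    within match_radius_m and attach the result as the detection's twist."""
+    for det in detections:
+        matches = [r for r in radar_objects
+                   if math.hypot(r["x"] - det["x"], r["y"] - det["y"]) < match_radius_m]
+        if matches:
+            det["vx"] = sum(r["vx"] for r in matches) / len(matches)
+            det["vy"] = sum(r["vy"] for r in matches) / len(matches)
+```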
+
+### Radar object tracking
+
+!!! warning
+
+ Under Construction
+
+### Merger of tracked objects
+
+!!! warning
+
+ Under Construction
+
+## Appendix
+
+### Customizing your own radar interface
+
+The perception interface of Autoware is defined by `DetectedObjects`, `TrackedObjects`, and `PredictedObjects`; however, custom messages can be defined for specific use cases. For example, [DetectedObjectWithFeature](https://github.com/tier4/tier4_autoware_msgs/tree/tier4/universe/tier4_perception_msgs/msg/object_recognition) is a customized message used in the perception module.
+
+In the same way, you can adapt a new radar interface.
+For example, `RadarTrack` does not have orientation information ([past discussions](https://github.com/ros-perception/radar_msgs/pull/3), especially [this discussion](https://github.com/ros-perception/radar_msgs/pull/3#issuecomment-661599741)).
+If you need orientation information, you can adapt the radar ROS driver to publish `TrackedObject` directly.
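+
+As an illustration of such an adaptation, the sketch below derives a yaw from the velocity vector when mapping a `RadarTrack`-style measurement to a `TrackedObject`-style one; the simplified dataclasses are stand-ins for the real ROS messages:
+
+```python
+import math
+from dataclasses import dataclass
+
+@dataclass
+class SimpleRadarTrack:  # stand-in for radar_msgs/msg/RadarTrack
+    x: float
+    y: float
+    z: float
+    vx: float
+    vy: float
+
+@dataclass
+class SimpleTrackedObject:  # stand-in for autoware_auto_perception_msgs/msg/TrackedObject
+    x: float
+    y: float
+    z: float
+    yaw: float  # RadarTrack has no orientation, so it must be derived
+    vx: float
+    vy: float
+
+def to_tracked_object(track: SimpleRadarTrack) -> SimpleTrackedObject:
+    """Assume the object faces its direction of travel to fill in the
+    orientation that RadarTrack does not provide."""
+    yaw = math.atan2(track.vy, track.vx)
+    return SimpleTrackedObject(track.x, track.y, track.z, yaw, track.vx, track.vy)
+```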