Embedding a camera experience within your own app shouldn't be that hard.
A Flutter plugin to integrate an awesome Android / iOS camera experience.

This package provides a fully customizable camera experience that you can use within your app.
Use our awesome built-in interface or customize it as you want.

Here are all the native features that CamerAwesome provides to the Flutter side:
| System | Android | iOS |
| :--- | :---: | :---: |
| Ask permissions | ✅ | ✅ |
| Record video | ✅ | ✅ |
| Enable/disable audio | ✅ | ✅ |
| Take photos | ✅ | ✅ |
| Photo live filters | ✅ | ✅ |
| Exposure level | ✅ | ✅ |
| Broadcast live image stream | ✅ | ✅ |
| Zoom | ✅ | ✅ |
| Device flash support | ✅ | ✅ |
| Auto focus | ✅ | ✅ |
| Live switching camera | ✅ | ✅ |
| Camera rotation stream | ✅ | ✅ |
| Background auto stop | ✅ | ✅ |
| Sensor type switching | ⛔️ | ✅ |
Add CamerAwesome to your `pubspec.yaml`:

dependencies:
camerawesome: ^1.2.0
...
- iOS: add these entries to your `ios/Runner/Info.plist` file:

<key>NSCameraUsageDescription</key>
<string>Your own description</string>

<key>NSMicrophoneUsageDescription</key>
<string>To enable microphone access when recording video</string>

<key>NSLocationWhenInUseUsageDescription</key>
<string>To enable GPS location access for Exif data</string>
- Android: change the minimum SDK version to 21 (or higher) in `android/app/build.gradle`:

minSdkVersion 21

If you want to record videos with audio, add this permission to your `AndroidManifest.xml`:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.example.yourpackage">
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- Other declarations -->
</manifest>
You may also want to save the location of your pictures in Exif metadata. In that case, add the permissions below:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.example.yourpackage">
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<!-- Other declarations -->
</manifest>
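On the Dart side, writing the GPS position into Exif data is opted into when setting up the builder. A minimal sketch, assuming an `exifPreferences` parameter and an `ExifPreferences` class with a `saveGPSLocation` flag (check the documentation for the exact name and placement in your version):

```dart
CameraAwesomeBuilder.awesome(
  // ...other parameters...
  // Assumed parameter: ask CamerAwesome to write the GPS position
  // into the Exif metadata of captured pictures.
  exifPreferences: ExifPreferences(saveGPSLocation: true),
),
```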
Then import it:

import 'package:camerawesome/camerawesome_plugin.dart';
Just use our builder.
That's all you need to create a complete camera experience within your app.
CameraAwesomeBuilder.awesome(
saveConfig: SaveConfig.image(
pathBuilder: _path(),
),
onMediaTap: (mediaCapture) {
OpenFile.open(mediaCapture.filePath);
},
),
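The `_path()` helper is left to you. A minimal sketch, assuming `pathBuilder` expects a function returning a `Future<String>` and using the `path_provider` package for a writable directory:

```dart
import 'package:path_provider/path_provider.dart';

// Hypothetical helper for the snippet above: returns a function that builds
// a unique .jpg path in the temporary directory for each capture.
Future<String> Function() _path() {
  return () async {
    final dir = await getTemporaryDirectory();
    final name = DateTime.now().millisecondsSinceEpoch;
    return '${dir.path}/$name.jpg';
  };
}
```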
Our builder also provides a custom factory.
Now you have access to the builder property and can create your own camera experience.
The camera preview will be visible behind what you provide to the builder.

Note: only the camera preview is not customizable yet.
CameraAwesomeBuilder.custom(
saveConfig: SaveConfig.image(pathBuilder: _path()),
builder: (state, previewSize, previewRect) {
// create your interface here
},
)
See more in the documentation.
Here is the definition of our builder method.
typedef CameraLayoutBuilder = Widget Function(CameraState cameraState, PreviewSize previewSize, Rect previewRect);
The only thing you have access to for managing the camera is the cameraState.
Depending on which state the camera experience is in, you will have access to different methods.
`previewSize` and `previewRect` can be used to position your UI around or on top of the camera preview.
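For example, you might use `previewRect` to place your controls relative to the preview. A minimal sketch (the layout is purely illustrative; `_path()` is the same hypothetical helper as above):

```dart
CameraAwesomeBuilder.custom(
  saveConfig: SaveConfig.image(pathBuilder: _path()),
  builder: (state, previewSize, previewRect) {
    return Stack(
      children: [
        // Place a capture button just below the camera preview.
        Positioned(
          top: previewRect.bottom + 16,
          left: 0,
          right: 0,
          child: Center(
            child: IconButton(
              icon: const Icon(Icons.camera),
              // In photo mode, start() takes a picture (see the when example below).
              onPressed: () => state.when(
                onPhotoMode: (photoState) => photoState.start(),
              ),
            ),
          ),
        ),
      ],
    );
  },
)
```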
Using the state you can do anything you need without having to think about the camera flow:
- On app start we are in `PreparingCameraState`.
- Then depending on the initialCaptureMode you set, you will be in `PhotoCameraState` or `VideoCameraState`.
- Starting a video will push a `VideoRecordingCameraState`.
- Stopping the video will push back the `VideoCameraState`.
Also, if you want to call a method specific to one state, you can use the when method and write something like this:
state.when(
onPhotoMode: (photoState) => photoState.start(),
onVideoMode: (videoState) => videoState.start(),
onVideoRecordingMode: (videoState) => videoState.pause(),
);
See more in the documentation.
Use this to achieve:

- QR-Code scanning.
- Facial recognition.
- AI object detection.
- Realtime video chats.

And much more!
You can check examples using MLKit inside the `example` directory: `ai_analysis_faces.dart` is used to detect faces, and `ai_analysis_barcode.dart` to read barcodes.
CameraAwesomeBuilder.awesome(
saveConfig: SaveConfig.image(
pathBuilder: _path(),
),
onImageForAnalysis: analyzeImage,
imageAnalysisConfig: AnalysisConfig(
outputFormat: InputAnalysisImageFormat.nv21, // choose between jpeg / nv21 / yuv_420 / bgra8888
width: 1024,
maxFramesPerSecond: 30,
),
)
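The analyzeImage callback is yours to write. A minimal sketch, assuming only that the callback receives an `AnalysisImage` and returns a `Future` (see the MLKit examples mentioned above for feeding the frame to an actual detector):

```dart
// Hypothetical analysis callback: receives preview frames according to the
// AnalysisConfig above. Keep the work light, frames can arrive up to
// maxFramesPerSecond times per second.
Future<void> analyzeImage(AnalysisImage img) async {
  // Forward the frame to your ML pipeline (MLKit, TFLite, ...),
  // ideally off the main isolate to keep the UI smooth.
}
```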
MLKit recommends using the nv21 format on Android, while bgra8888 is the iOS format.
For machine learning you don't need full-resolution images: a width of 1024 or lower should be enough and makes computation easier.
See more in the documentation.
Through the state you can access a `SensorConfig` class.
| Function | Comment |
| :--- | :--- |
| setZoom | change the zoom level |
| setFlashMode | change the flash between NONE, ON, AUTO and ALWAYS |
| setBrightness | change the brightness level manually (better to leave this on auto) |
All of these configurations are listenable through a stream, so your UI can automatically get updated according to the current configuration.
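For example (a sketch: the setters come from the table above, while the `sensorConfig` accessor and the FlashMode value names are assumptions to check against your version):

```dart
// Grab the sensor configuration from the current camera state.
final sensorConfig = cameraState.sensorConfig;

// Values are illustrative; zoom is typically normalized between 0.0 and 1.0.
sensorConfig.setZoom(0.5);
sensorConfig.setFlashMode(FlashMode.auto);
sensorConfig.setBrightness(0.2);
```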
Apply live filters to your pictures using the built-in interface:
You can also choose to use a specific filter from the start:
CameraAwesomeBuilder.awesome(
// other params
filter: AwesomeFilter.AddictiveRed,
)
Or set the filter programmatically:
CameraAwesomeBuilder.custom(
builder: (cameraState, previewSize, previewRect) {
return cameraState.when(
onPreparingCamera: (state) =>
const Center(child: CircularProgressIndicator()),
onPhotoMode: (state) =>
TakePhotoUI(state, onFilterTap: () {
state.setFilter(AwesomeFilter.Sierra);
}),
onVideoMode: (state) => RecordVideoUI(state, recording: false),
onVideoRecordingMode: (state) =>
RecordVideoUI(state, recording: true),
);
},
)
See all available filters in the documentation.