What we're building
We'll build a cross-platform mobile app for taking photos and uploading them to Firebase.
In Part 1, we'll take a picture and save it to Firebase Cloud Storage, and then show it in our app.
In Part 2, we'll offload the uploading work, and use the blueimp library to generate a thumbnail locally and show it while uploading.
The stack
- Vue JS - Component framework
- Cordova - Cross platform mobile framework
- Quasar - UI framework (and CLI)
- Firebase Cloud Storage - For storing the photos
- Web Workers - For offloading the uploading to a separate thread
Scaffolding
We'll use the Quasar CLI to initialize a new project and run it in Cordova mode (Android or iOS) to see the app running on your connected device.
quasar create vue-firebase-image-upload
cd vue-firebase-image-upload
quasar dev -m android
You'll have to add https: true in the devServer section of quasar.conf.js if running on Android > 9.
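For reference, here's a minimal sketch of that change (assuming the default quasar.conf.js layout generated by the CLI):
// quasar.conf.js (excerpt)
module.exports = function (ctx) {
  return {
    // ...
    devServer: {
      // Android 9+ blocks cleartext HTTP by default, so serve the dev build over HTTPS
      https: true
    },
    // ...
  };
};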
You should get a basic working version that looks like this:
This is a good time to make your first commit.
For more details on how to install and set up Quasar, see this post.
Taking a picture and getting base64
The way to store images in Firebase Cloud Storage is by saving the base64 string of the image, using Firebase's putString method.
Notice that you have to remove the base64 prefix from the string before uploading, or Firebase will reject the string.
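For example, the FileReader result is a data URL like data:image/jpeg;base64,/9j/4AAQ..., and one way to strip that prefix before calling putString (a small helper we'll reuse in Index.vue below) is:
// Keep only the part after "data:<mime>;base64," since putString expects raw base64
function removeBase64Prefix(base64Str) {
  return base64Str.substr(base64Str.indexOf(",") + 1);
}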
To begin, we'll add a button that takes a picture using Cordova's camera plugin, and print the base64 string.
Adding plugins
First, add the Cordova camera and file plugins:
cd src-cordova
cordova plugin add cordova-plugin-file
cordova plugin add cordova-plugin-camera
Taking the picture
In order to take a picture, we'll add some code in a new service file src/services/cordova-camera.js:
async function getCameraFileObject() {
  return new Promise((resolve, reject) => {
    const camera = navigator.camera;
    const options = {
      quality: 50,
      destinationType: camera.DestinationType.FILE_URI,
      encodingType: camera.EncodingType.JPEG,
      mediaType: camera.MediaType.PICTURE,
      saveToPhotoAlbum: true,
      correctOrientation: true
    };
    camera.getPicture(
      imageURI => {
        // Resolve the returned URI to a FileEntry, then to a File object
        window.resolveLocalFileSystemURL(
          imageURI,
          function (fileEntry) {
            fileEntry.file(
              function (fileObject) {
                resolve(fileObject);
              },
              function (err) {
                console.error(err);
                reject(err);
              }
            );
          },
          function (err) {
            console.error(err);
            reject(err);
          }
        );
      },
      function (err) {
        console.error(err);
        reject(err);
      },
      options
    );
  });
}

async function getBase64FromFileObject(fileObject) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onloadend = function (evt) {
      // Load the result into an Image to make sure it decodes as a picture
      const image = new Image();
      image.onload = function () {
        resolve(evt.target.result);
      };
      image.src = evt.target.result;
    };
    reader.onerror = reject;
    reader.readAsDataURL(fileObject);
  });
}

async function getBase64FromCamera() {
  const fileObject = await getCameraFileObject();
  const base64 = await getBase64FromFileObject(fileObject);
  return base64;
}

export default {
  getBase64FromCamera
};
This service is doing several steps (there's a short usage sketch after the list):
- Takes a picture using Cordova's camera plugin. This returns an image URI.
- Gets a file entry using Cordova's file plugin, with the resolveLocalFileSystemURL function.
- Gets a File object using the file method.
- Uses FileReader to get the base64 representation of the file.
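A minimal usage sketch of the service (assuming it's imported from a component file and Cordova's deviceready event has already fired):
import cordovaCamera from "../services/cordova-camera";

async function demo() {
  // Opens the camera UI and resolves with a data-URL string
  const base64 = await cordovaCamera.getBase64FromCamera();
  console.log(base64.slice(0, 40)); // "data:image/jpeg;base64,..."
}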
Now let's add a simple event bus. Add the file src/services/event-bus.js with content:
import Vue from 'vue';
export const EventBus = new Vue();
Now in src/layouts/MyLayout.vue, we'll add a button in the toolbar for taking a picture, and use our event bus to send the handling to Index.vue:
<template>
  ...
  <q-btn flat dense round @click="takePicture">
    <q-icon name="camera" />
  </q-btn>
  ...
</template>

<script>
import { EventBus } from "../services/event-bus.js";

export default {
  ...
  methods: {
    takePicture() {
      EventBus.$emit('takePicture')
    }
  }
}
</script>
Finally, we'll catch the takePicture event in our main component, Index.vue, call the cordova-camera.js service function, and print the base64 result (which we'll later upload to Firebase). In the <script> section of Index.vue:
import { EventBus } from "../services/event-bus";
import cordovaCamera from "../services/cordova-camera";

export default {
  name: "PageIndex",
  mounted() {
    EventBus.$off("takePicture");
    EventBus.$on("takePicture", this.uploadImageFromCamera);
  },
  methods: {
    async uploadImageFromCamera() {
      let base64 = await cordovaCamera.getBase64FromCamera();
      console.log("base64", base64);
    }
  }
};
Running with quasar dev -m android and looking at the Chrome DevTools, we can see the base64 output printed in the console as a long string.
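If you're running on a physical Android device, you can typically open those DevTools from desktop Chrome via chrome://inspect/#devices (remote WebView debugging).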
Adding Firebase
Web SDK or native SDK?
When using Firebase in a Cordova app, you can choose between using the Web SDK and a Cordova plugin that wraps the native SDKs. The current Firebase Cordova plugins seem a little unstable to me, and don't yet fully support Firebase Cloud Storage. And since we can offload work to a web worker, we can use the full-featured official Web SDK without taking the performance hit.
Add the Firebase SDK using yarn:
yarn add firebase
Firebase setup
Follow the instructions in the Firebase Cloud Storage setup page.
After the setup you should have a Firebase project with Storage enabled. We'll save the settings in a separate file, firebase-config.js (this file won't be committed to our repo). The contents should look something like this:
export default {
  firebase: {
    apiKey: '<your-api-key>',
    authDomain: '<your-auth-domain>',
    databaseURL: '<your-database-url>',
    storageBucket: '<your-storage-bucket-url>'
  }
}
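Since firebase-config.js holds your project keys and shouldn't end up in the repo, one way to keep it out (assuming the file lives next to the other services, as above) is a .gitignore entry:
# .gitignore (excerpt)
src/services/firebase-config.js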
Uploading the image
Now that we have Firebase settings in place, we'll add another service for handling Firebase app initialization and uploading.
Add the file src/services/cloud-storage.js with content:
import * as firebase from "firebase/app";
import 'firebase/storage';
import firebaseConfig from './firebase-config';

async function uploadBase64(imageData, storageId) {
  return new Promise((resolve, reject) => {
    let uploadTask = firebase.storage().ref().child(storageId).putString(imageData, "base64");
    uploadTask.on(
      "state_changed",
      function (snapshot) {
        // Progress snapshots arrive here (unused for now)
      },
      function (error) {
        reject(error);
      },
      function () {
        // Upload finished; resolve with the public download URL
        uploadTask.snapshot.ref
          .getDownloadURL()
          .then(function (downloadURL) {
            console.log("Uploaded a blob or file!");
            console.log("got downloadURL: ", downloadURL);
            resolve(downloadURL);
          });
      }
    );
  });
}

function initialize() {
  firebase.initializeApp(firebaseConfig.firebase);
}

export default {
  uploadBase64,
  initialize
};
The uploadBase64 function uploads imageData (the base64 string) using the putString method, under the storage child named storageId. You can use any ID you want; here I've chosen the Unix timestamp as the child ID. After the upload is done, we return the image URL using the getDownloadURL function of the upload task.
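A quick usage sketch (assuming the base64 prefix has already been stripped and the Firebase app has been initialized, as shown next):
import cloudStorage from "../services/cloud-storage";

async function uploadExample(imageData) {
  // Any unique ID works; here, the current Unix timestamp
  const storageId = new Date().getTime().toString();
  const downloadURL = await cloudStorage.uploadBase64(imageData, storageId);
  console.log("stored at:", downloadURL);
}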
In our main App.vue file we'll call the initialize function when the app is mounted:
import cloudStorage from './services/cloud-storage'

export default {
  name: 'App',
  mounted() {
    cloudStorage.initialize();
  }
}
Finally we'll call the uploading function from Index.vue. This is now the content of Index.vue:
<template>
  <q-page>
    <div class="row justify-center q-ma-md" v-for="(pic, idx) in pics" :key="idx">
      <div class="col">
        <q-card>
          <q-img spinner-color="white" :src="pic" />
        </q-card>
      </div>
    </div>
  </q-page>
</template>

<script>
import { EventBus } from "../services/event-bus";
import cordovaCamera from "../services/cordova-camera";
import cloudStorage from "../services/cloud-storage";

export default {
  name: "PageIndex",
  data() {
    return {
      pics: []
    };
  },
  mounted() {
    EventBus.$off("takePicture");
    EventBus.$on("takePicture", this.uploadImageFromCamera);
  },
  methods: {
    removeBase64Prefix(base64Str) {
      return base64Str.substr(base64Str.indexOf(",") + 1);
    },
    async uploadImageFromCamera() {
      const base64 = await cordovaCamera.getBase64FromCamera();
      const imageData = this.removeBase64Prefix(base64);
      const storageId = new Date().getTime().toString();
      const uploadedPic = await cloudStorage.uploadBase64(imageData, storageId);
      this.pics.push(uploadedPic);
    }
  }
};
</script>
We've added the pics array to our component's data, containing the URLs of all the pictures we've uploaded. We've added a v-for that will show each picture in a Quasar q-img component inside a q-card component.
We've also changed uploadImageFromCamera to take a picture, get the base64 string (stripping the prefix so Firebase won't freak out), calculate the storageId from the current datetime, and call the uploading function. After the upload finishes, we add the resulting URL to the pics array.
And we can now see the uploaded image:
The full code is on GitHub.
In the next part of the series we'll move tasks to a web worker, add a loading spinner, and save the URLs in local storage. Stay tuned!