Sharing my Docker 2-stage build for production with yarn.
-
For Quasar pre-V1, I was using an approach with Docker to publish my apps.
For V1, this approach gave some issues. Maybe useful for other devs, so here is my new Dockerfile:

```dockerfile
# Build stage
FROM node:8 AS buildenv

ENV YARN_VERSION 1.13.0
RUN curl -fSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz" \
  && tar -xzf yarn-v$YARN_VERSION.tar.gz -C /opt/ \
  && ln -snf /opt/yarn-v$YARN_VERSION/bin/yarn /usr/local/bin/yarn \
  && ln -snf /opt/yarn-v$YARN_VERSION/bin/yarnpkg /usr/local/bin/yarnpkg \
  && rm yarn-v$YARN_VERSION.tar.gz

WORKDIR /generator
ENV projectName "my-project"

# Copy the manifests first so the dependency layer is cached
COPY ./${projectName}/package.json .
COPY ./${projectName}/yarn.lock .
RUN yarn

# Copy the rest of the sources and build the SPA
COPY ./${projectName} .
RUN node node_modules/@quasar/app/bin/quasar-build

# Runtime stage
FROM nginx
ENV projectName "my-project"
COPY --from=buildenv /generator/dist/spa /usr/share/nginx/html
# COPY ./${projectName}/default.conf /etc/nginx/conf.d/default.conf   # in case you want nginx-specific conf
EXPOSE 80
```
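For reference, this is roughly how I build and run the image (the image tag and host port are just placeholders; the Dockerfile assumes it sits in the directory above the `my-project` folder):

```sh
# Run from the directory that contains the Dockerfile and the my-project folder
docker build -t my-project:latest .

# Serve the built SPA; nginx listens on port 80 inside the container
docker run -d -p 8080:80 my-project:latest
```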
Some remarks:
- I ran into something very strange when I used npm (with Quasar V1) in Docker: npm didn't pick up the latest Quasar node_modules. That's why I moved to yarn in the Dockerfile.
- The Dockerfile explicitly installs the yarn version pinned in `YARN_VERSION`.
- I needed to downgrade to node:8.
- No need to install @quasar/cli in the Docker container! The build stage calls the local build script directly (see the sketch below).
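On that last point: since @quasar/app ships a local `quasar` binary in node_modules/.bin, the build step could also be written through yarn's bin resolution. A minimal sketch, assuming the standard @quasar/app v1 layout:

```dockerfile
# Equivalent build step: yarn 1.x resolves the local quasar binary
# from node_modules/.bin, so no global @quasar/cli is needed
RUN yarn quasar build
```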
Please provide feedback in case you see potential improvements.
Cheers!
-
Yarn is the only way to go. Continue to use npm for globals, though.
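For example, something along these lines (using @quasar/cli from this thread as the global package; the split itself is the point):

```sh
# Project dependencies: yarn, driven by yarn.lock
yarn add --dev @quasar/app

# Global CLI tools: npm, so they land under the currently active Node version
npm install -g @quasar/cli
```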
-
@Hawkeye64 What precisely is the problem with global yarn?
-
- There have been instances when Yarn borks everything up globally. If you haven’t hit it yet, then don’t worry about it until you do.
- Yarn puts globals into its own location, thereby excluding use of Node versioning toolsets like NVM (see the sketch below).
At work, we have 4.5.3, 5.0/5.1, and (upcoming) 6.0 versions of our product, each of which uses a different version of Node and associated npm packages, so I have to use NVM to manage all that.
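A quick way to see the difference (the exact paths vary by OS and configuration; these are illustrative):

```sh
# npm globals live under whichever Node version nvm has activated:
nvm use 10
npm bin -g        # e.g. ~/.nvm/versions/node/v10.x.x/bin

# yarn globals live in yarn's own directory, shared across all Node versions:
yarn global bin   # e.g. ~/.config/yarn/global/node_modules/.bin or ~/.yarn/bin
```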