Software Engineering Asked by user3243135 on October 29, 2021
Say that I am developing a web application that has the following structure:
Currently, I develop this app with all three parts running separately: I run the frontend using Angular's dev server, I run the middleware as a standalone process (with nodemon), and I run scripts manually against a development database.
This works "ok" as long as I am working solo, but it is quickly becoming unmanageable as I try to bring on other frontend devs (I will still be the only middleware/database developer). For example, I have put instances of the middleware and database on a development server that is accessible to all. However, this makes it hard for me to make changes (the data model and API are changing rapidly), because any change may break whatever the others are working on. I think I need to be able to version the API and the database, but I'm not exactly sure how to do that. There may also be problems with version skew between the components.
One thing I've thought about is putting the middleware and database (with all data preloaded) into a Docker container and having the other frontend devs run it with Docker Compose or something. However, I'm not sure how well that would work on Windows. I don't have the expertise to run something more complicated like Kubernetes at the moment. Also, in the long term, I'm not sure I want to deploy in containers, because putting a database in Docker does not seem 'right' to me for some reason (maybe I'm biased).
Any advice on the correct development workflow and/or project structure and/or products and services that might help?
Say there is a branch named develop that developers integrate with frequently. I usually find it valuable to have a CD process deploy develop builds to a shared machine that is accessible to all. That CD process needs to be protected by a quality gate: for example, we usually don't allow merges into develop if unit tests fail, and integration tests run after develop has been deployed.
Depending on your development velocity and frequency of integration, develop should remain stable enough for other feature development to occur against it. By that I mean the frontend team can test a feature branch against the state of the backend that has made it into develop, and the backend team can test a feature branch against the state of the frontend that is in develop.
For breaking changes, feature flags are a useful capability to have. This way one team can work on a breaking change and turn that on based on their needs while everyone else remains unaffected.
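As an illustration, a flag can be as small as a helper that reads an environment variable, with route handlers branching on it. This is a minimal sketch, not a prescribed design; the flag name, env-var convention, and payload shapes below are all made up:

```typescript
// Minimal feature-flag helper: a flag named "new_data_model" is toggled
// by setting FEATURE_NEW_DATA_MODEL=1. Names here are illustrative.
function isEnabled(
  flag: string,
  env: Record<string, string | undefined> = process.env
): boolean {
  return env[`FEATURE_${flag.toUpperCase()}`] === "1";
}

// A handler branches on the flag, so the team working on the breaking
// data-model change opts in while everyone else sees the old shape.
function getUsersPayload(
  env: Record<string, string | undefined> = process.env
): object {
  if (isEnabled("new_data_model", env)) {
    return { users: [], schemaVersion: 2 }; // new, breaking shape
  }
  return { users: [] }; // old shape everyone else still gets
}
```

In practice you would likely centralize flag lookup (config file, flag service) rather than scatter env-var reads, but the branching pattern is the same.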
For APIs, API fakes can be really helpful. API fakes are dumb servers that spew out pre-defined content with no business logic. I think API fakes are frequently used to test Angular code with no back-end dependency.
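For example, a fake can be nothing more than a map of routes to canned JSON plus a tiny HTTP server the Angular dev server proxies to. The paths and payloads below are placeholders, not the real API:

```typescript
import { createServer, Server } from "node:http";

// Canned responses keyed by path; no business logic, just fixtures.
// Paths and payloads are illustrative, not from any real API.
const fixtures: Record<string, unknown> = {
  "/api/users": [{ id: 1, name: "Alice" }],
  "/api/projects": [{ id: 7, title: "Demo" }],
};

// Pure lookup: returns the serialized fixture, or undefined for a miss.
export function fakeBody(path: string): string | undefined {
  const hit = fixtures[path];
  return hit === undefined ? undefined : JSON.stringify(hit);
}

// Wire the fixtures into an HTTP server the frontend can point at.
export function startFake(port: number): Server {
  return createServer((req, res) => {
    const body = fakeBody(req.url ?? "");
    res.writeHead(body ? 200 : 404, { "Content-Type": "application/json" });
    res.end(body ?? JSON.stringify({ error: "not found" }));
  }).listen(port);
}
```

Because the fixtures are plain data checked into the repo, frontend devs can evolve them alongside API changes without running the real middleware or database.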
Answered by Martin K on October 29, 2021
We have been using Docker for development and testing, and it works great. The docker-compose files are stored in Git, as are all of the setup/configuration scripts, so everyone is working with the same setup. Maven has plugins to run containers as part of automated integration tests, which makes life a lot easier.
NOTE: Be sure to pin explicit versions of the Docker images you are using; don't just use the "latest" tag.
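A sketch of such a compose file, with pinned image tags and a persistent volume for the database (service names, tags, and credentials below are placeholders to adapt to your project):

```yaml
# docker-compose.yml -- image versions pinned, never "latest"
services:
  middleware:
    build: ./middleware        # placeholder path to your Node middleware
    ports:
      - "3000:3000"
    depends_on:
      - db
  db:
    image: postgres:14.10      # explicit tag; swap for your actual database
    environment:
      POSTGRES_PASSWORD: devonly   # placeholder credential for local dev only
    volumes:
      - dbdata:/var/lib/postgresql/data  # named volume keeps data across restarts
volumes:
  dbdata:
```

Checking this file into Git means every developer (and CI) starts the exact same stack with `docker compose up`.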
You do not need to think about Kubernetes during development.
Whether you deploy to production using containers or not does not affect day-to-day development: your software doesn't care whether the other end of a network connection is running in a lightweight container, a full-on VM, or a bare-metal machine.
For what it's worth, I think you're biased against containers for some reason; we've run databases in containers, with and without Kubernetes, with no problems. The data has to be stored in a persistent volume, but that's easily configured.
Answered by Matthew on October 29, 2021
You definitely want to figure out a way to make it easy to run the whole solution locally on a development machine.
You mention "I'm not sure how well that would work on Windows". Maybe you want to require all your developers to use Linux? That would simplify things greatly. Then you can have a set of Bash scripts that set everything up locally for development.
You also mention Docker. Docker on Windows can easily run Linux images (Docker Desktop runs them in a Linux VM, via WSL 2 or Hyper-V), so it might simplify deploying and running the pieces of your solution locally.
I recommend running an experiment: completely clear your development environment, do a clean OS installation, and check out the code from source control. After that, how long does it take to set up your development environment, and how much of that is manual work? In an ideal situation, you run a single script and it sets up your whole environment: it starts all the services, builds the source code, runs the automated tests, and maybe even configures IDEs with reasonable team-wide defaults.
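Such a script might look like the sketch below. It is environment setup, so the exact steps, paths, and script names (`frontend/`, `middleware/`, `scripts/seed-db.sh`, port 4200) are assumptions about the repo layout, not known facts:

```shell
#!/usr/bin/env bash
# One-shot dev environment bootstrap (sketch; adjust paths and steps
# to your actual repository layout).
set -euo pipefail

# Start the backend pieces; assumes a docker-compose.yml in the repo root.
docker compose up -d db middleware

# Reproducible installs for both Node projects.
(cd frontend && npm ci)
(cd middleware && npm ci && npm test)

# Hypothetical script that preloads the development data set.
./scripts/seed-db.sh

echo "Environment ready: frontend dev server can now run on http://localhost:4200"
```

If a new developer can run this and be productive within minutes, version skew and "works on my machine" problems mostly disappear.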
Answered by Euphoric on October 29, 2021