Flutter + Firebase Functions and Hosting recommended folder structure - firebase

As my project grows, I want to set up an elegant and scalable folder structure, but I'm unsure what the best practices are.
My project includes:
- a Flutter app
- Firebase Functions
- Firebase Hosting with a static landing page (and soon 2 Angular apps)
My current structure is this:
Root flutter app
|
+-- README.md
+-- package.json (with flutter build commands)
+-- Flutter code (lib, test, assets, etc.)
|
+-- Firebase
| |
| +-- package.json (with firebase build commands)
| +-- Functions
| | |
| | +-- src (functions code)
| | |
| +-- Firestore
| | |
| | +-- firestore.rules
| | +-- firestore.indexes
Should my root directory look like this:
- Flutter
- Firebase (Functions, hosting, firestore)
Or something more like this:
- Flutter
- Functions
- Hosting (App1, App2, Static landing etc)
- Firestore
But then where is it best to keep build files and deploy scripts for each site/service/app?
I haven't implemented CI/CD yet, but I plan to implement it soon, so I'd probably want to take that into consideration.
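For concreteness, the multi-site setup I have in mind would presumably be driven by a single firebase.json at the repo root, roughly like this (the target names and paths are placeholders, not a working config):
{
  "functions": { "source": "functions" },
  "firestore": {
    "rules": "firestore/firestore.rules",
    "indexes": "firestore/firestore.indexes.json"
  },
  "hosting": [
    { "target": "landing", "public": "hosting/landing" },
    { "target": "app1", "public": "hosting/app1/dist" },
    { "target": "app2", "public": "hosting/app2/dist" }
  ]
}
Each hosting target would then be linked to its site with firebase target:apply hosting <target> <site-id>, and the per-app build/deploy scripts could sit next to firebase.json.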
Thanks for any advice!

Related

Creating Wordpress "Stand alone" plugin in Themosis Framework

The Themosis Framework provides a Laravel-like, MVC-style development environment for WordPress.
It all looks great, but I got stuck developing a "stand-alone" plugin, because the plugin looks for configuration files in a higher-level directory.
This is what the root directory looks like:
+-- app/
| +-- Console
| +-- Exceptions
| +-- Forms
| +-- Hooks
| +-- Http
| | +-- Controllers
| +-- Mail
| +-- Providers
| +-- Widgets
+-- bootstrap/
+-- config/
| +-- app.php
| +-- ...
| +-- wordpress.php
+-- database/
+-- htdocs/
| +-- cms/
| +-- content/
| +-- index.php
| +-- wp-config.php
+-- resources/
| +-- languages/
| +-- views/
+-- routes/
| +-- console.php
| +-- web.php
+-- storage/
+-- tests/
+-- vendor/
+-- .env
+-- composer.json
+-- console
+-- wp-cli.yml
This is what the plugin directory looks like in /htdocs/content/plugins:
+-- assets
+-- config
+-- dist
+-- inc
+-- languages
+-- resources
+-- views
+-- package.json
+-- plugin-name.php
+-- routes.php
+-- webpack.mix.js
Themosis has an interesting compilation feature (even though it's a PHP-based framework), so I thought compiling would create some asset that lets the plugin work as a standalone plugin.
But the error (the plugin looking for resources stored in a higher-level directory) still remains.
Would anyone know if making a "standalone" plugin is possible using the Themosis framework?
Thank you!

NextJS On-demand entries Returns 404

I'm currently experiencing an issue with on-demand entries: when a page is requested to be compiled on the fly, the dev server returns a 404.
Below is the typical pages directory structure. When inspecting the dev tools console, there is a fetch call to the dev server to compile the page, in this case article.tsx (/article/article-slug), and that fetch request returns a 404.
.
+-- pages
| +-- latest-news
| | +-- index.tsx
| +-- article
| | +-- [slug].tsx
Environment Details:
OS: Ubuntu 18.04.4 LTS
Node: v15.8.0
NPM: 6.14.11
NextJS: 10.0.5
You need a pages/article/[article-slug].tsx page if you want NextJS to recognize the slug.
Follow the NextJS tutorial for more details.
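As a point of reference, a minimal dynamic route page could look like the sketch below (using the [slug].tsx name from the tree above; the bracketed filename is what defines the query parameter name, and the component name is arbitrary):
// pages/article/[slug].tsx -- minimal sketch of a dynamic route page
import { useRouter } from 'next/router';

export default function ArticlePage() {
  const router = useRouter();
  const { slug } = router.query; // "/article/article-slug" -> slug === "article-slug"

  // router.query is empty on the very first render, so guard against it
  if (!slug) return null;

  return <h1>Article: {slug}</h1>;
}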

Writing and importing custom plugins in Airflow

This is actually two questions combined into one.
My AIRFLOW_HOME is structured like
airflow
+-- dags
+-- plugins
| +-- __init__.py
| +-- hooks
| | +-- __init__.py
| | +-- my_hook.py
| | +-- another_hook.py
| +-- operators
| | +-- __init__.py
| | +-- my_operator.py
| | +-- another_operator.py
| +-- sensors
| +-- utils
I've been following astronomer.io's examples here https://github.com/airflow-plugins. My custom operators use my custom hooks, and all the imports are relative to the top level folder plugins.
# my_operator.py
from plugins.hooks.my_hook import MyHook
However, when I tried moving my entire repository into the plugins folder, I got an import error after running airflow list_dags saying that plugins cannot be found.
I read up a little about it and apparently Airflow loads the plugins into its core module so they can be imported like
# my_operator.py
from airflow.hooks.my_hook import MyHook
So I changed all the imports to read directly from airflow.plugin_type instead. I get another import error though, this time saying that my_hook cannot be found. I restart my workers, scheduler and webserver every time but that doesn't seem to be the issue. I've looked at solutions proposed in similar questions and they don't work either.
The official documentation (https://airflow.apache.org/plugins.html) also shows this way of extending the AirflowPlugin class, but I'm not sure where this "interface" should reside. I would also prefer a drag-and-drop option.
Finally, it clearly doesn't make sense for my code repo to be the plugins folder itself, but if I separate them testing becomes inconvenient. Do I have to modify my Airflow configurations to point to my repo every time I run unit tests on my hooks/ops? What are the best practices for testing custom plugins?
I figured this out by doing some trial and error. This is the final structure of my AIRFLOW_HOME folder
airflow
+-- dags
+-- plugins
| +-- __init__.py
| +-- plugin_name.py
| +-- hooks
| | +-- __init__.py
| | +-- my_hook.py
| | +-- another_hook.py
| +-- operators
| | +-- __init__.py
| | +-- my_operator.py
| | +-- another_operator.py
| +-- sensors
| +-- utils
In plugin_name.py, I extend the AirflowPlugin class
# plugin_name.py
from airflow.plugins_manager import AirflowPlugin
from hooks.my_hook import *
from operators.my_operator import *
from utils.my_utils import *
# etc

class PluginName(AirflowPlugin):
    name = 'plugin_name'
    hooks = [MyHook]
    operators = [MyOperator]
    macros = [my_util_func]
In my custom operators which use my custom hooks, I import them like
# my_operator.py
from hooks.my_hook import MyHook
Then in my DAG files, I can do
# sample_dag.py
from airflow.operators.plugin_name import MyOperator
It is necessary to restart the webserver and scheduler; it took me a while to figure that out.
This also facilitates testing, since the imports within the custom classes are relative to the submodules within the plugins folder. I wonder if I can omit the __init__.py file inside plugins, but since everything is working I didn't try removing it.
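On the testing question, one approach that fits this layout (a sketch only; the file name, pytest, and a no-argument MyHook constructor are assumptions) is to put AIRFLOW_HOME/plugins on sys.path in the test module, so the same "from hooks..." imports used inside the operators resolve without going through Airflow's plugin manager:
# tests/test_my_hook.py -- sketch of a plain unit test for a custom hook
import os
import sys

# Make AIRFLOW_HOME/plugins importable, mirroring how the operators import the hooks
AIRFLOW_HOME = os.environ.get("AIRFLOW_HOME", os.path.expanduser("~/airflow"))
sys.path.insert(0, os.path.join(AIRFLOW_HOME, "plugins"))

from hooks.my_hook import MyHook


def test_my_hook_instantiates():
    hook = MyHook()  # assumes MyHook needs no constructor arguments
    assert hook is not None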

Contents of INST not getting installed during R package install

I am having trouble getting the contents of my inst folder to copy over to the root folder of the installed package (or to copy over at all).
My R package depends on two Java packages and their jars; these are both included as follows:
PackageTest
+-- inst
| +-- impala-jdbc-cdh5
| | +-- *.jar
| +-- sqlserver-jdbc-4
| | +-- *.jar
+-- R
| +-- hello.R
+-- man
| +-- hello.Rd
+-- DESCRIPTION
+-- NAMESPACE
+-- .Rbuildignore
From my understanding, the contents of the package's inst folder should be copied to the root of the installed package, resulting in a folder structure that looks like this:
PackageTest
+-- impala-jdbc-cdh5
| +-- *.jar
+-- sqlserver-jdbc-4
| +-- *.jar
+-- R
| +-- PackageTest
| +-- PackageTest.rdb
| +-- PackageTest.rdx
+-- Meta
+-- html
+-- help
+-- DESCRIPTION
+-- NAMESPACE
+-- INDEX
Instead I am missing the two folders listed at the top:
PackageTest
+-- R
| +-- PackageTest
| +-- PackageTest.rdb
| +-- PackageTest.rdx
+-- Meta
+-- html
+-- help
+-- DESCRIPTION
+-- NAMESPACE
+-- INDEX
There are other files in the other folders, but nothing that I am concerned with.
What I have tried:
I am using RStudio to generate this package, and all I am doing is modifying the DESCRIPTION file to include the two Java packages (so it really is a stub package with some Java jars; I have also imported rJava).
To build and install the package I have tried using devtools::build and devtools::install, as well as the shortcut that RStudio provides (Ctrl + Shift + B).
I have also tried pushing this repo to our internal Git server and using devtools::install_git, which results in the same issue.
Restructuring the inst folder so that all the jars are in inst/Java has not helped either.
Lastly, the package we have deployed in our production repo is bundled exactly the same way as the first example, and we have no issues using the Impala and SQL drivers there.
Any help would be appreciated. :)
EDIT:
.Rbuildignore
^.*\.Rproj$
^\.Rproj\.user$
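One quick way to check what actually made it into the installed package (a sketch, assuming the package installs under the name PackageTest):
# Inspect the installed package's top level -- the jar folders from inst/
# should appear here alongside DESCRIPTION, Meta, R, etc.
installed_root <- system.file(package = "PackageTest")
list.files(installed_root)

# Full path to a bundled driver folder, or "" if it was not installed
system.file("impala-jdbc-cdh5", package = "PackageTest")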

TFS project structure makes simple things difficult

My team is currently working on an ASP .NET website. We are one of the first teams in our organization to use TFS2008 for source control. When I joined the project, it had already been active for a few months. Below is a diagram of the basic file structure we are using in TFS:
$/TfsProject/
|
| /* Contains our in-house class libraries. */
|-- Common/
| |
| |-- Extensions/
| | |-- Extensions.csproj
| |
| |-- Loggers/
| | |-- Loggers.csproj
|
| /* Contains third-party libraries. */
|-- Library/
| |
| |-- EnterpriseLibrary/
| | |
| | |-- v4.1/
| | | |-- Microsoft.Practices.EnterpriseLibrary.Common.dll
|
| /* Contains the website itself. */
|-- Site/
| |
| |-- Packages/
| | |-- Packages.csproj
| |
| |-- Website.root/
| | |
| | |-- Website/
| | | |-- Website.sln
| | | |
| | | |-- Website/
| | | | |-- Website.csproj
| | | | |-- Default.aspx
| | | |
| | | |-- WebsiteUnitTests/
| | | | |-- WebsiteUnitTests.csproj
| | | |
| | | |-- WebsiteWebControls/
| | | | |-- WebsiteWebControls.csproj
| | | |
| | | |-- Utilities/
| | | | |-- Utilities.csproj
The main website solution (Website.sln) currently contains fifteen projects (including each of the .csproj files displayed in the diagram). Yesterday a decision was made that the projects contained in the Common directory should be moved into their own solution, and we should include them in the Website by referencing the compiled DLLs instead of the projects themselves. Anytime one of the Common projects is updated, all other projects that use it should begin using the latest version with minimal effort.
Is there any easy way to implement this, based on our current hierarchy? I have read the TFS patterns & practices guide, but implementing any of its suggestions would require significant changes (as well as updating all of our projects and solutions). Also, our organization is waiting until TFS2010 is released before enabling Team Builds, so they're unavailable to us.
The more "portable" solution would be to have a build specifically for the shared projects/solutions. The last step of those builds is to check in the binaries into a publishing folder (possibly under /libraries). When getting latest for the client projects (those referencing the binaries) you will end up pulling down the latest binaries. You don't lose the ability to branch the client projects and team members are free to map folders as they choose.
I will say as a whole, you should reconsider your folder structure. It doesn't allow for a very flexible branching structure. You appear to be using your TFS repository much like many VSS users historically have: As a versioned file system.
A solution that may work is to have the projects in Common all output to the same directory ("C:\temp\dlls"), rather than each local /bin folder. That directory can then be added to the Web Project solution. All of the references to the DLLs should be made to the common folder pulled from TFS. That directory can be added to TFS as a folder. Everyone on the team will have to have the folder mapped locally with the same relative path to the solution file.
The place where the "minimal effort" piece breaks down is that the files will have to be checked in/out when compiled. The rest of the team would then have to GetLatest. The GetLatest requirement may actually be better, because you don't want changes forced on you while you are in the middle of developing.
You basically end up having a folder of compiled DLLs added to the web project solution. That is also the folder that all the Common DLLs are built to, and it is where all the projects in the web solution reference the Common DLLs. When someone rebuilds, they have to check out the DLLs in the folder, build, and then check them back in. When a developer wants the latest, they call GetLatest on the folder and rebuild their projects.
This actually worked for us in a similar situation where we had compiled DLLs to reference. The difference for us is that the compiled DLLs changed so infrequently that the whole "GetLatest" paradigm never came into play.
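In .csproj terms, the client projects then carry plain file references into that shared folder rather than project references; a rough sketch, with placeholder relative paths:
<!-- Fragment of Website.csproj (sketch): file references to the checked-in
     Common binaries instead of project references. Adjust the paths to
     wherever the shared folder is mapped relative to the project. -->
<ItemGroup>
  <Reference Include="Extensions">
    <HintPath>..\..\Common\Binaries\Extensions.dll</HintPath>
    <SpecificVersion>False</SpecificVersion>
  </Reference>
  <Reference Include="Loggers">
    <HintPath>..\..\Common\Binaries\Loggers.dll</HintPath>
    <SpecificVersion>False</SpecificVersion>
  </Reference>
</ItemGroup>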
