I have to deploy my app now, and I was wondering whether it is a good idea to bring an app to production with the accounts-base package. Wouldn't it be better to manage this with our own methods?
I think trying to manage it all yourself is a surefire way to run into problems in the future. The concept of accounts seems simple, but the implementation never is, especially when you start sprinkling in all the extra complexities (registration, resetting passwords, validation emails, 3rd party authentication, role-based authorization, complete integration with your app, etc.).
At the end of the day, you don't gain much by spending days (or more) implementing and testing that stuff (hoping you got everything right) when you could be spending that time building the next big app. I'd much rather use tried and true packages that have been verified by thousands of users across thousands of apps than try to roll my own.
It depends on your purposes and preferences. Since accounts-base seems to fit my projects well, I recommend getting the most out of it rather than wasting your time reinventing the wheel.
However, within the scope of Meteor, if you want to make changes to the package, you can clone it into your packages directory and modify it your way.
The solution is to create as many fields as I want, read them in onCreateUser, and save them to my database.
More info: http://docs.meteor.com/api/accounts-multi.html#AccountsServer-onCreateUser
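As a minimal sketch of that pattern: the client passes extra data inside options.profile, and the server-side onCreateUser hook copies it onto the new user document. The `company` and `role` field names here are hypothetical examples, not anything prescribed by the package.

```javascript
// server/accounts.js — server-side only.
import { Accounts } from 'meteor/accounts-base';

// Registering onCreateUser replaces the default handler, so we set
// user.profile ourselves. Whatever object we return is what gets
// inserted into the users collection.
Accounts.onCreateUser((options, user) => {
  const profile = options.profile || {};
  user.company = profile.company; // hypothetical custom field
  user.role = profile.role;       // hypothetical custom field
  user.profile = profile;
  return user;
});
```

On the client you would then call something like `Accounts.createUser({ email, password, profile: { company: 'Acme', role: 'editor' } }, callback)`.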
It may sound like a really trivial question, but I am developing an Android app with some friends and, of course, we use GitHub to collaborate. However, I am not sure where to store the server-side logic for collaboration. Do I init a new repo just for those cloud functions, or store them together with my Android app source code?
Generally, there is a huge amount of discussion about how to do such things. I think you have to decide for yourself, depending on your team structure and the technologies used. Personally, once upon a time, I preferred to keep them separate; however, I then started working on a huge project that lived in one huge SVN repo, and that worked well too. Since then I am not so sure anymore...
In my opinion, the most important thing is to align with all team members on the rules, so that everyone understands and follows them throughout the project.
I can provide some links to interesting articles/discussions:
link1, link2, link3, link4
I hope this helps!
I'm starting to write some in-browser automated tests for our Shopify store, and I noticed that I inadvertently caused a massive traffic spike to our store while I was developing.
Is there a way to make a browser visit not count toward Shopify analytics, like a "nostats" query param or something? I may eventually end up with dozens of tests running maybe a dozen times a day, and that will make a significant difference to our analytics.
Right now I'm testing against a previewed theme deployed with themekit, so I'm not testing against the live theme.
I could create a dev store and copy over all our products/collections/etc., but I'd really rather test as close to the live store as possible. If that's stupid (or if there's a really easy way to make my dev store mirror my live store), let me know.
There's no way to disable Shopify analytics or stop the data collection in the way you'd like. So you would definitely need to use a development store to run your tests.
There are a number of apps available for store data syncing/migration. That's an easy option, but it might be quite expensive, depending on your resources.
You can also create your own solution to sync the entities you need for testing, as in the sketch below. That's not as easy, but it's a good fit if you want to apply the process to multiple Shopify projects.
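As a rough sketch of what such a sync could look like, here is a one-way copy of products from the live store to a development store via the Shopify Admin REST API. The store domains, token environment variables, and API version are assumptions to adjust for your setup; real code would also handle pagination, nested variant/image IDs, and rate limits.

```javascript
// sync-products.js — rough one-way product sync (Node 18+, global fetch).
const LIVE = { shop: 'my-live-store', token: process.env.LIVE_TOKEN };
const DEV = { shop: 'my-dev-store', token: process.env.DEV_TOKEN };
const API_VERSION = '2024-01'; // assumption: pick a current version

async function shopifyFetch({ shop, token }, path, options = {}) {
  const res = await fetch(
    `https://${shop}.myshopify.com/admin/api/${API_VERSION}/${path}`,
    {
      ...options,
      headers: {
        'X-Shopify-Access-Token': token,
        'Content-Type': 'application/json',
        ...options.headers,
      },
    }
  );
  if (!res.ok) throw new Error(`${res.status} ${await res.text()}`);
  return res.json();
}

async function syncProducts() {
  // NOTE: fetches only the first page; follow the `Link` response
  // header to paginate through the full catalog.
  const { products } = await shopifyFetch(LIVE, 'products.json?limit=50');
  for (const product of products) {
    const { id, ...rest } = product; // drop the live store's product ID
    await shopifyFetch(DEV, 'products.json', {
      method: 'POST',
      body: JSON.stringify({ product: rest }),
    });
  }
}

syncProducts().catch(console.error);
```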
I was looking at some larger-scale Meteor applications and was wondering why some of the initial (front-facing) sites do not seem to use Meteor.
As an example, when you go to Classcraft and look at the main website, you notice it is not using Meteor.
Then when you go to their actual application (click signup, for example) you can see it uses Meteor.
So they make a clear separation in terms of technology. Can someone explain the reasons? Is it not as efficient/clean to just use Meteor for the whole thing?
Thanks,
Jean
Each company makes its own decisions on how/when/where to use technologies. In the case of Meteor, its really strong suit is real-time updating, which makes things like messaging systems and pushing updates out quickly good uses for Meteor.
It appears as though Classcraft has decided they don't need that capability on the home page. There are also some SEO concerns with Meteor that perhaps Classcraft didn't want to deal with.
Finally, the home page not being built in Meteor shields the DB from public view. That's not a huge security advantage, but it may be one they considered.
This is all me guessing at reasons for them, as I don't know why they made that decision. It isn't the decision I make for my own sites/apps, but that doesn't mean others might not see things differently.
I'm the founder of Classcraft. To answer your question, it's because we didn't need everything Meteor had to offer for the front-facing website: reactivity, flexible templates, a database, etc. Meteor is amazing for building apps, but it's overkill for a static website. Also, if the front-facing website were built within the game app, any copy changes or tweaks to it would force us to redeploy the app, which means some downtime (not much, but still) for our users. Keeping them separate also allows marketing people (who aren't developers) to tinker with it without going into the game's code base.
We decided to build the front-facing website using Middleman. Middleman generates a precompiled static website, which allows for amazing speed and simple server configuration (it's served from S3, which means it's super fast).
I'm sure the reasons are different for everybody, but that's what it was for us.
Shawn
I have been searching a lot for info and examples on the principles of building a simple multi-user web application.
The app I am going to make is for deadline management and can be described as a simple calendar where users can register events.
I have no problem building this for a single user in PHP or ASP.NET, but how can I make it work for multiple users, so they can register and see only their own data?
The app itself is pretty simple, and there will not be many users, 50-100 at most.
I find it hard to find info about this topic.
My own idea, which probably isn't the right way to do it, is:
When a user creates an event, store it in a table with the user's ID.
When selecting data, use the logged-in user's ID and get the corresponding event(s).
I would strongly recommend working within a framework in order to avoid reinventing the wheel. If you know Python, consider Flask, Pylons, or Django. If you would prefer to continue working in PHP (you should avoid working with ASP.NET if you are ever going to work with non-Windows developers), try Drupal. Ruby on Rails has some options as well, but I've never used it.
The way you are attempting to implement this is likely to lead to an oversized, overcomplicated database that is very hard for new developers to get used to. If you must implement this yourself, you should have a user/password table, an events table, and a table linking the two together (e.g. assigning ownership), with every query scoped by the logged-in user's ID, as in the sketch below.
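To make the core idea concrete, here is a minimal sketch in Node.js using the mysql2 package; the same pattern carries straight over to PHP or ASP.NET with parameterized queries. The database and table names are hypothetical, and in a real app the user ID would come from the session rather than being hard-coded.

```javascript
// events.js — per-user data isolation: every query is scoped by the
// logged-in user's ID, so users can never see each other's events.
const mysql = require('mysql2/promise');

async function main() {
  const db = await mysql.createConnection({
    host: 'localhost',
    user: 'app',
    database: 'calendar', // hypothetical database name
  });

  const userId = 42; // in a real app, taken from the authenticated session

  // Insert an event owned by the current user.
  await db.execute(
    'INSERT INTO events (user_id, title, deadline) VALUES (?, ?, ?)',
    [userId, 'Project deadline', '2024-06-01']
  );

  // Select only this user's events — the WHERE clause is what keeps
  // each user's data private.
  const [rows] = await db.execute(
    'SELECT title, deadline FROM events WHERE user_id = ?',
    [userId]
  );
  console.log(rows);

  await db.end();
}

main().catch(console.error);
```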
Currently, we have a long list of various websites throughout our company's intranet. Most are inside a firewall and require an Active Directory account to access. One of our problems, as of late, has been the increase in the number of websites and the addition of a common code library that stores our database access classes, common helper functions, serialization methods, etc. The goal is to use that framework across all websites throughout the company.
Currently, we have consistently applied these changes to the in-house data entry application; it is up to date. The problem, however, is maintaining all of the other websites. Is there a best practice, or a way to find out which version each website is on and upgrade it accordingly? Can I keep these DLLs in a centralized place that the sites reference? What's the best way to find out what versions these websites are on without having to go through every single website, check the version, and upgrade after every change?
Keep in mind, we run the newest TFS and are a .NET development team.
At my job we have a setup similar to yours, with lots of internal applications that use common libraries, and I have spent the best part of a year sorting this all out.
The first thing to note is that nothing you mentioned really has anything to do with TFS; it is a symptom of the way your applications, and their components, are packaged and deployed.
Here are some ideas to get you started:
Set up automated/continuous builds
This is the first thing you need to do. Use the build facility in TFS if you must, or make the investment in something like TeamCity (which is great). Evaluate everything. Find something you love and that everyone else can live with. The reason you need to find something you love is that you will ultimately be responsible for it.
The reason setting up automated builds is so important is that they are your jumping-off point for solving the rest of your issues.
Set up automated deployment
Every deployable artifact should now be built by your build server. No more manual deployment. No more deployment from workstations. No more Visual Studio Publish feature. It's hard to step away from these habits, but it's worth it.
If you have lots of web projects, then look into using Web Deploy, which can easily be automated with MSBuild/PowerShell, or go fancy and try something like Octopus Deploy.
Package common components using NuGet
By now your common code should have its own automated builds, but how do you automatically deploy a common component? Package it up with NuGet and either put it on a share for consumption or host it in a NuGet server (TeamCity has one built in). A good build server can automatically update your NuGet packages for you (if you always need to be on the latest version), and you can see which version you are referencing by checking your packages.config, as in the sample below.
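For reference, a packages.config pins the exact version of each shared library a site references, which is what lets you audit versions across sites without opening every project. The package names here are made up for illustration.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- packages.config — each site pins the exact versions of the shared
     libraries it references; hypothetical package names. -->
<packages>
  <package id="MyCompany.Data" version="2.3.1" targetFramework="net45" />
  <package id="MyCompany.Helpers" version="2.3.1" targetFramework="net45" />
</packages>
```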
I know this is a lot to take in, but it is in its essence the fundamentals of moving towards continuous delivery (http://continuousdelivery.com/).
Be aware that getting this right will take a long time, but the process is incremental and you can evolve it over time. However, the longer you wait, the harder it will be. Don't feel you need to upgrade all your projects at the same time; you don't. Just do the ones that are causing the most pain.
I hope this helps.
I'd just like to step outside the space of a specific solution to your problem and address your underlying desire to consolidate your workload.
Be aware that any patching/upgrading scenario will have costs that you must address - there is no magic pill.
In particular, what you want to achieve will typically incur either a build/deploy overhead (as jonnii has outlined) or a runtime overhead (in validating the new versions to ensure everything works as expected).
In your case, because you have already built your products, I expect you will go the build/deploy route.
Just remember that even with binary equivalence (everything compiles and unit tests pass), there is still the risk that an application will behave somewhat differently after an upgrade, so you will not be able to avoid at least some rudimentary testing across all of your applications (the GAC approach is particularly vulnerable to this risk).
You might find it easier to accept that just because you have built a new version of a binary doesn't mean it should be rolled out to all web applications, even ones that are already functioning correctly (if it ain't broke...).
If that is acceptable, then you will reduce your workload by only spending testing resources on the applications that actually need to be touched.