We're building an information system for an organization, consisting of multiple microservices, web apps, and mobile apps, on .NET Core.
There are a bunch of settings, some of which are shared between parts of the system. These settings can be edited by the end-user in one of the system's web apps.
We want to build, or make use of, some centralized storage for these settings.
What I DON'T MEAN by settings:
Endpoint URLs, protocol versions, etc.
Keys, certificates, etc.
Database connection strings
These settings definitely must go to the configuration files, Azure Key Vault, Consul, ZooKeeper, etc.
What I DO MEAN by settings:
Organization name, address, phones, etc.
Web app content - label texts, block texts, images, logos, etc.
Business process settings - for example, "at what time of day do you want to refresh data from the external service?"
In general, these are settings that hold business value and can be edited by the end-user of the system.
For me, this sounds a bit like a Content Management System (CMS): developers define what parameters/values they need to build an app, and these parameters/values can then be set up while the system is in operation. However, a CMS is a very website-centric thing; it typically consists of pages, authors, comments, etc. We, in contrast, want a more low-level style of content, where we can define our own data types and store them.
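To make the idea more concrete, here is a minimal sketch of the kind of typed, developer-defined settings I have in mind, read and written through some central store (all names here are hypothetical, not an existing product):

    using System;
    using System.Threading.Tasks;

    // Hypothetical examples of the "business settings" we mean.
    public sealed class OrganizationSettings
    {
        public string Name { get; set; }
        public string Address { get; set; }
        public string[] Phones { get; set; }
    }

    public sealed class RefreshSettings
    {
        // "At what time of day do you want to refresh data from the external service?"
        public TimeSpan DailyRefreshTime { get; set; }
    }

    // Every service would read (and the admin web app would write) such settings
    // through something like this, backed by whatever central store we choose.
    public interface IBusinessSettingsStore
    {
        Task<T> GetAsync<T>(string key) where T : class, new();
        Task SaveAsync<T>(string key, T value) where T : class;
    }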
So, my questions are:
Is it a good and common approach to separate business system settings from the rest of the system into their own microservice?
Is it OK to store business process settings, UI texts, organization details, and other settings all in one place, or does that mix separate concerns?
Are there any existing solutions for this task, or do we need to build our own?
I have a problem with unintended data sharing between different applications under same channel.
The problem is that if a user adds items to a basket in one application, that same basket can be viewed in a different application that is not intended for the type of items in the basket.
Below is the layout of applications under a single channel:
My question is whether it is possible to enforce different sessions for different application types (perhaps with a configuration setting), or whether there is some other built-in way of preventing data sharing between applications.
No, this cannot be achieved. By design, the Site is the session container of Intershop Commerce Suite; within a Site, different applications share the same storefront session.
I've been asked to develop a .NET web application with the following requirements and features:
Moderate software license expenses
.NET Web Application
Document storage (with change history, although a complete CMS is not needed)
Complex data model
Extensible and groupable object attributes
Private/public field visibility
Non-trivial relationships between database tables
Custom alert configuration (screen and e-mail notifications) about approaching due dates, missing documentation, etc.
Resource access control & user management (roles and groups)
High user volume (several thousands of users)
Many complex and dynamic forms
Search engine
Statistical reporting
Bulk data & metadata upload and download
Simple data migration
REST API for external integration
Multilanguage
Full-featured mobile version (for tablets and smartphones)
Corporate look and feel
These are the options I have considered:
SharePoint Foundation 2013 + Custom Web Parts + Custom DB + Document Libraries
Sense/Net + Custom Web Parts + Custom DB + Document Libraries
Custom ASP.NET Web Application
What approach would you recommend? Also, can you please make a recommendation on the following points?
Server characteristics and topology
Application architecture
Scalability
Search capabilities
Reporting tools
Persistence framework
Document storage (MS Office)
Mobility
First of all, I work for Sense/Net, which I want to put out there to be fair.
However, even if I didn't, I'd still recommend looking at our solution based on the criteria you outlined. What you are planning to build seems to be quite custom, and from experience I can say that projects like this never stop changing. Going for an open-source application would definitely be my choice, to make sure I don't hit a wall later down the line.
Sense/Net is practically capable of delivering everything you need out of the box, but of course, customization will be needed.
From a licensing perspective, you would probably also be better off, since we only license the CPU cores, not the thousands of users benefiting from the system.
Writing a custom application from scratch with these requirements would make no sense in my opinion, as the costs would be well above those of a ready-made solution (whichever one you choose).
The things that need to be clarified are the reporting tools you will need, and whether you need a native application for mobile devices or whether something that works in their browsers would be sufficient.
I can see that this answer is well overdue, but if the topic is still of interest and you haven't done so yet, drop us an e-mail through our website and we can help you find the perfect solution!
As of 2013, what is the best practice for multi-language, multi-culture localization of web pages?
Should I store all the translations in a database, keyed by language, etc.?
Should I build .resx files based on a database query, or should I simply create .resx files directly?
The application will be used with a browser interface across multiple devices and platforms, e.g. Windows/iOS/Android. Are there any additional things I should think about?
best regards
Shrekito
There are a few types of global content you will usually have to deal with:
Application templates such as HTML pages, menus, and emails sent by your app
Dynamic content stored in your database
App templates
Templates are typically handled using your web framework's internationalization (i18n) support. In the case of ASP.NET, the framework includes libraries to mark strings and export them to (English) .resx files. This supports a lot of built-in features such as variables, plurals, and developer comments, so unless you're facing an actual blocker, you typically want to go with it instead of building your own i18n framework (e.g. in-database, or in-DB with .resx export).
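As a quick illustration, here is a minimal sketch of consuming such .resx files from code; the resource base name and key are assumptions, not part of any specific project:

    using System.Globalization;
    using System.Reflection;
    using System.Resources;

    public static class LabelLookup
    {
        // Assumes a Labels.resx (neutral English) plus Labels.de.resx etc. compiled
        // into the web project; the base name below is illustrative only.
        private static readonly ResourceManager Resources =
            new ResourceManager("MyWebApp.Resources.Labels", Assembly.GetExecutingAssembly());

        public static string Get(string key, string cultureName)
        {
            // Falls back to the neutral-culture (English) string when the
            // requested culture has no translation yet.
            return Resources.GetString(key, CultureInfo.GetCultureInfo(cultureName));
        }
    }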
Nowadays, the workflow decisions come when you actually want to get those files localized. In the old days you'd exchange the files with an agency over email. In 2013, web apps have a fast-paced development cycle with multiple releases per week (or per day), so there is a much bigger need for automation. Teams are switching away from using a VCS, FTP, or Dropbox as temporary storage for their files to modern localization (L10n) management platforms like Transifex (disclosure: I'm the founder) to manage their localization process.
The best way to remove the L10n pain from developers is to integrate your L10n platform with your build tools, so whenever you commit something, the English files are sent automatically to the L10n system, which will detect the changes and notify the right translators. When you're ready to deploy, your integration will pull the fresh translated files automatically.
With Transifex, you can use the Tx Client app with a git-commit hook, your build/CI system, or your good ol' deploy script:
tx push --source
tx pull -l de,fr,it --mode=reviewed --minimum-perc=90
Dynamic and External content
This type of content is typically not handled using .resx files, since it's easier to manage it in your DB itself. Ideally, you still want to manage it alongside your template content so that translators have a single place to look at everything.
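For illustration, a minimal sketch of how such in-database translations are often modelled; the class and column names are assumptions, not a prescribed schema:

    // Illustrative entity for translations of dynamic content stored in the app database.
    public class ContentTranslation
    {
        public int Id { get; set; }
        public string ContentKey { get; set; }   // e.g. "product:42:description"
        public string LanguageCode { get; set; } // e.g. "de", "fr", "it"
        public string Text { get; set; }
        public bool IsReviewed { get; set; }     // lets you serve only reviewed strings
    }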
The platform you choose will need to have a modern API to work with. Check out the Transifex API to see what this looks like.
Other tips and tricks
Choose a platform which has a strong built-in web-based translation editor. Translators need solid features such as Translation Memory, Glossary, Machine Translation to deliver quality and consistent translations.
Follow good i18n practices, such as culture-aware date formatting and avoiding string concatenation (see the snippet after this list).
Love your translators. They're the ones who make it all happen.
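As an illustration of the formatting and concatenation points above, a minimal sketch (the date, culture, and resource string are made up):

    using System;
    using System.Globalization;

    class I18nTips
    {
        static void Main()
        {
            var due = new DateTime(2013, 4, 27);
            var german = CultureInfo.GetCultureInfo("de-DE");

            // Let the culture drive the date format instead of hard-coding "MM/dd/yyyy".
            Console.WriteLine(due.ToString("D", german));  // Samstag, 27. April 2013

            // Use one translatable format string instead of concatenating fragments,
            // so translators can reorder the sentence for their language.
            string template = "Your order is due on {0}."; // would come from a .resx entry
            Console.WriteLine(string.Format(german, template, due.ToString("d", german)));
        }
    }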
I'm creating a web application using ASP.NET and WCF in a 3-tier architecture; it mostly looks like a social website. Users can register with the system and upload their profile images, documents, video clips, etc. So, what I want to know is: what is the best way to store those files? On the WCF side or the web application side?
Also, if I choose the web application side and store those files in a set of folders, how do I make those folders shared and allow access from another project (such as a desktop client that needs to upload files into that shared folder)?
Thank you all in advance.
I think the question can better be put like this:
save the images in a folder in, or close to, the web application and have the metadata stored in a database
grab the saved images from a database via WCF
The second approach would likely be rather slow: grabbing the information over a service, converting it, and then using an HttpHandler with the correct MIME type to spit out the binary stream to the browser...
Most architectures cut it down the middle: save the images close to, or in, the UI layer, and store the metadata about them in the database. That metadata is mostly just a bunch of strings, so it's easily retrieved.
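For the images-near-the-UI approach, a minimal sketch of the HttpHandler mentioned above, streaming a stored image with the correct MIME type; the folder layout and query-string contract are assumptions:

    using System.IO;
    using System.Web;

    // Streams a profile image from a folder inside the web application with the
    // correct MIME type. Folder and query-string names are assumptions.
    public class ProfileImageHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // Path.GetFileName guards against "../" style path traversal in the id.
            string id = Path.GetFileName(context.Request.QueryString["id"] ?? "");
            string path = context.Server.MapPath("~/App_Data/ProfileImages/" + id + ".jpg");

            if (!File.Exists(path))
            {
                context.Response.StatusCode = 404;
                return;
            }

            context.Response.ContentType = "image/jpeg";
            context.Response.TransmitFile(path); // streams without buffering the whole file
        }
    }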
Update for the new question:
Since WinForms applications/other projects were not in your original question, this deviates into something new. In that case you could go for one of the following scenarios:
Use the WCF tier as common ground and store the images behind that service. As I said, it adds overhead to pull the byte arrays over the service.
Store the images in the web UI tier and have a service (ASMX or WCF) expose the images to your WinForms client.
Create a share for the WinForms client on the server where the web UI runs and where the images are. Of course, be mindful of security and possible attacks.
It depends on which scenario is used the most. My assumption is that the web UI layer will be used the most, and that WinForms is going to be used for image manipulation? If so, there are third-party ASP.NET controls available for such manipulation as well, so the need for a WinForms client would decrease.
This depends on how big you expect this thing to get.
If this is for the wider internet and you expect it to get big, having the files on the web server will make it difficult to scale out your application by adding new web servers to your web farm.
One approach would be to have the physical files uploaded to the web server, to make uploads quick for users, and then have a coordinator background service that is triggered by an upload, perhaps using a FileSystemWatcher. This service would propagate the file to all nodes in the web farm so that subsequent requests to other nodes will find the file.
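A minimal sketch of that coordinator idea, assuming the other farm nodes expose an upload share (all paths are made up):

    using System;
    using System.IO;

    // Watches the upload folder on the node that received the file and copies
    // new files to the other web-farm nodes. All paths here are made up.
    class UploadPropagator
    {
        static readonly string[] OtherNodes = { @"\\web02\uploads", @"\\web03\uploads" };

        static void Main()
        {
            var watcher = new FileSystemWatcher(@"C:\Sites\MyApp\uploads")
            {
                EnableRaisingEvents = true
            };

            watcher.Created += (sender, e) =>
            {
                foreach (var node in OtherNodes)
                {
                    // A real service would retry and handle files still being written.
                    File.Copy(e.FullPath, Path.Combine(node, e.Name), true);
                }
            };

            Console.WriteLine("Watching for uploads. Press Enter to stop.");
            Console.ReadLine();
        }
    }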
If it is a small application intended only for use within a company, storing the files on the web server is okay, with the following conditions:
You have full control over the hosting server so that you can set up the appropriate folder permissions.
You write your file saving and retrieving code in such a way that it can be moved onto the lower tiers without too much pain. Do it through an interface and inject the implementation (see the sketch below).
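A minimal sketch of what that interface might look like, with a local-folder implementation that could later be swapped for a database- or service-backed one (names are illustrative):

    using System.IO;

    // Hides the storage location behind an interface so the file-system implementation
    // can later be swapped for a database, WCF, or blob-storage one. Names are illustrative.
    public interface IFileStore
    {
        void Save(string fileName, Stream content);
        Stream Open(string fileName);
    }

    public class LocalFolderFileStore : IFileStore
    {
        private readonly string _rootFolder;

        public LocalFolderFileStore(string rootFolder)
        {
            _rootFolder = rootFolder;
        }

        public void Save(string fileName, Stream content)
        {
            using (var target = File.Create(Path.Combine(_rootFolder, Path.GetFileName(fileName))))
            {
                content.CopyTo(target);
            }
        }

        public Stream Open(string fileName)
        {
            return File.OpenRead(Path.Combine(_rootFolder, Path.GetFileName(fileName)));
        }
    }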
I have an intranet application that needs contact information for various locations on our campus that are served by our IT lab support organization. We have an enterprise directory that contains the contact information, so I'm not keeping the actual contact information in the database, but rather an immutable identifier that serves as a key to look the person up in our enterprise directory (via a web service). I'll be looking up contact information via a publicly available web site.
The problem comes in that the id that is useful to the web-based directory lookup is only "sort of" immutable and is not the id that I will store in the database. Directory lookups are most easily performed using the person's Active Directory login id. What I will be using is called the Master Records Unique ID.
My question is: where is the most reasonable place to do the translation from MRUID to Active Directory login id for the link?
Right now I'm doing the translation in the presentation layer, with application-level caching to reduce lookups against the directory. Currently there is only a single web site, but I would expect that if there are other web sites that need to do this, I would migrate the helper class to a shared web controls library.
I considered putting the code in the data or business layer, but opted not to because of the caching. How and what you cache seems to be more a function of the application rather than these other layers.
I'd be interested in other opinions and ideas that I may not have considered.
When faced with something that needs to be in the presentation layer of an ASP.NET web site or web application, but that may also have value in other ASP.NET web applications, I find it useful to create a special class library that has a dependency on the System.Web namespace.
Specifically, it will make use of HttpContext.Current to interoperate with the web site that is hosting the library. I generally think of this as a business layer assembly, just one that assumes it is hosted in a web context.
I keep my true business code (code that might be used in a non-web application) in a third assembly.
Having an assembly that depends on the web context allows you to use HttpContext.Current to find out what is going on with the request and response objects, as well as to interact with the ASP.NET cache API and related features. It also keeps the code portable for use in more than one web application.
Generally, this web-dependent assembly is also where my HttpModules and HttpHandlers live.
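To tie this back to the question, a minimal sketch of such a web-aware helper caching the MRUID-to-login translation via HttpContext.Current; the lookup delegate and cache duration are assumptions:

    using System;
    using System.Web;
    using System.Web.Caching;

    // Lives in the assembly that is allowed to reference System.Web and caches the
    // MRUID -> Active Directory login translation in the ASP.NET cache.
    public static class DirectoryIdCache
    {
        public static string GetAdLogin(string mruid, Func<string, string> lookupFromDirectory)
        {
            var cache = HttpContext.Current.Cache;
            string cacheKey = "AdLogin:" + mruid;

            var cached = (string)cache[cacheKey];
            if (cached != null)
            {
                return cached;
            }

            string adLogin = lookupFromDirectory(mruid); // e.g. the enterprise directory web service
            if (adLogin != null)
            {
                // The four-hour expiry is an arbitrary choice for this sketch.
                cache.Insert(cacheKey, adLogin, null,
                             DateTime.UtcNow.AddHours(4), Cache.NoSlidingExpiration);
            }
            return adLogin;
        }
    }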
Keep in mind, though, that "layers" are logical concepts, not physical ones. There is nothing wrong with an assembly that contains business, DAL, and even presentation layer classes together. The classes themselves shouldn't mix up their roles, but a single assembly can contain classes from different logical layers in your design.
You could place it in your business layer and still use caching, either by using the Enterprise Library Caching Application Block in the business layer, or by caching the value returned by the business layer in the ASP.NET cache in your presentation layer.
As it's coming from a different location to your other data I wouldn't put it in the same data access layer as the other database code.
I discussed this issue with some other developers at work and we decided that the presentation layer was the right place to do the translation. Consider the case where different applications that use the same business/data access layers want to translate the data in different ways. Unless we have a clearly defined business rule that states that individual identities shall always be displayed in a certain form, I think I'll leave it where it is and migrate it to a web controls library as needed to support multiple front-ends.