What are all the Financial Industry Standards, Protocols and Data Models? [closed]

While preparing for the TOGAF exam, I came across open standards for data models, service models, and architectures in many industries.
(For example: ARTS has defined a data model for the retail industry.
Energistics has defined a data model for the petrotechnical industry.
FPL has defined the FIX protocol for trading.
SWIFT and ISO have standards for interbank messaging.)
I would like to know from this community:
Which open standards (protocols, data models, service models) do you work with in the financial industry? How many of them are really "open" and widely used?

Here are three widely used standards:
FIX
FpML
SWIFT
Most of these standards are "member-owned" by organizations such as ISDA, SWIFT, or FPL, which you can join. Many standards are free as in beer, although the organizations usually earn money from reference implementations and other services.
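To give a feel for the FIX wire format, here is a minimal sketch in Java of parsing a FIX tag=value message. The sample message, class name, and parse method are purely illustrative and not taken from any official FIX engine; real implementations also validate the checksum and body length.

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class FixMessageSketch {

        // FIX fields are "tag=value" pairs separated by the SOH (0x01) control character.
        static Map<Integer, String> parse(String rawMessage) {
            Map<Integer, String> fields = new LinkedHashMap<>();
            for (String field : rawMessage.split("\u0001")) {
                if (field.isEmpty()) continue;
                int eq = field.indexOf('=');
                fields.put(Integer.parseInt(field.substring(0, eq)), field.substring(eq + 1));
            }
            return fields;
        }

        public static void main(String[] args) {
            // Hypothetical NewOrderSingle (35=D); the field values are made up for illustration.
            String msg = "8=FIX.4.2\u000135=D\u000155=IBM\u000154=1\u000138=100\u0001";
            System.out.println("Symbol (tag 55): " + parse(msg).get(55));
        }
    }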

The underlying standard for Financial Services is ISO 20022. SWIFT, ISDA (FpML), FPL (FIX), and many others are all part of the ISO 20022 community.
ISO 20022 is broad in that it covers all of financial services, from cards to regulators.
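ISO 20022 messages themselves are XML documents generated from a central data dictionary. As a rough sketch of what handling one looks like in Java, the snippet below parses a heavily simplified payment-initiation-style skeleton; the element names are abbreviated ISO 20022-style tags used for illustration only, not a complete or validated schema.

    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;

    public class Iso20022Sketch {
        public static void main(String[] args) throws Exception {
            // Heavily simplified, illustrative skeleton, not a full ISO 20022 document.
            String xml =
                "<Document>"
              + "<CstmrCdtTrfInitn>"
              + "<GrpHdr><MsgId>ABC123</MsgId><CreDtTm>2024-01-01T09:30:00</CreDtTm></GrpHdr>"
              + "</CstmrCdtTrfInitn>"
              + "</Document>";

            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));

            // Pull a single field out of the parsed tree.
            String msgId = doc.getElementsByTagName("MsgId").item(0).getTextContent();
            System.out.println("Message ID: " + msgId);
        }
    }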

Related

Definition of Data Engineering in Big Data [closed]

I fully understand that this question may be closed, as it is more a matter of opinion than a technical question with an objective answer. However, I want to ask it in case someone can help and provide a good response. I think it is important to define what you do in a succinct way, so here it goes.
Q: If you are asked "What is Data Engineering?", what would your definition be? (Not "What does a Data Engineer do?")
This one came to mind, but does someone have a better one? I am talking in the context of Hadoop/Big Data.
A:
Data engineering is the process of taking Big Data that is stored in either a structured or unstructured format, processing it in batch or real-time, and generating data in a new format that can be used for further consumption, visualization, Machine Learning or Data Science.
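To make the batch half of that definition concrete, here is a toy sketch in Java. The input file name and record layout are invented for illustration: it reads raw comma-separated events, reshapes them, and writes a derived data set in a new format for downstream consumption.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;
    import java.util.stream.Collectors;

    public class BatchReshapeSketch {
        public static void main(String[] args) throws IOException {
            // Hypothetical raw input: one "userId,action,timestamp" record per line.
            List<String> rawRecords = Files.readAllLines(Paths.get("events_raw.csv"));

            // Transform each record into a consumer-friendly format (JSON-ish lines here).
            List<String> reshaped = rawRecords.stream()
                    .map(line -> line.split(","))
                    .filter(parts -> parts.length == 3)   // drop malformed records
                    .map(p -> String.format(
                            "{\"user\":\"%s\",\"action\":\"%s\",\"ts\":\"%s\"}", p[0], p[1], p[2]))
                    .collect(Collectors.toList());

            // Emit the derived data set for visualization, ML, or further processing.
            Files.write(Paths.get("events_clean.jsonl"), reshaped);
        }
    }

At Hadoop scale the same read-transform-write shape shows up in MapReduce or Spark jobs rather than a single JVM, but the idea is identical.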
I would like to share what I think is a definition of Data Engineering in the context of Big Data:
Data Engineering supports and provides the expertise to design, construct, and maintain Big Data systems. It uses the tools, techniques, frameworks and skills that are essential to a good "Data Infrastructure" or "Data Architecture" behind a Big Data solution.
A good way to define Data Engineering is to understand what a Data Engineer does. Here is a great infographic about it: https://www.datacamp.com/community/blog/data-engineering-vs-data-science-infographic
Some of the responsibilities listed include:
Develop, construct, test and maintain architectures;
Ensure the architecture will support the requirements of the business;
Develop data set processes for data modeling, mining and production;
Recommend ways to improve data reliability, efficiency and quality.

what "concern metrics" means in Software Engineering [closed]

Recently, I was reading a paper titled "On the Effectiveness of Concern Metrics to Detect Code Smells: An Empirical Study".
I come from a non-English-speaking country, and I cannot quite understand what "concern metrics" means in the field of software engineering.
Is it referring to the relationship between objects?
I have some understanding of Java and C#; perhaps someone could use Java to give me an example.
Thanks.
As the paper's abstract says: "While traditional metrics quantify properties of software modules, concern metrics quantify concern properties, such as scattering and tangling." Are you familiar with the concept of a cross-cutting concern? This question provides examples of concerns: Cross cutting concern example. Try reading papers on aspect-oriented programming (AOP) to grasp more of the concepts and better understand the relationship between concerns and code. The metrics are attempts to quantify, for instance, how scattered a concern (e.g. login) is across the source code.
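Since the question asks for a Java example: below is a small, invented sketch of a logging concern scattered across two unrelated classes, together with a crude "degree of scattering" count. The classes and the metric are hypothetical illustrations; the actual concern metrics in the paper are computed by dedicated tools over concern mappings.

    import java.util.Arrays;
    import java.util.List;

    // Two unrelated business classes that both contain logging code:
    // the logging concern cuts across them, i.e. it is scattered.
    class OrderService {
        void placeOrder() {
            System.out.println("LOG: placing order");   // logging concern, occurrence 1
            // ... business logic ...
        }
    }

    class PaymentService {
        void charge() {
            System.out.println("LOG: charging card");   // logging concern, occurrence 2
            // ... business logic ...
        }
    }

    public class ScatteringSketch {
        public static void main(String[] args) {
            // A toy "concern mapping": which classes contain code for the logging concern.
            List<String> classesTouchedByLogging = Arrays.asList("OrderService", "PaymentService");

            // A crude concern metric: degree of scattering = number of modules the concern touches.
            System.out.println("Logging concern scattered over "
                    + classesTouchedByLogging.size() + " classes");
        }
    }

Traditional metrics (lines of code, coupling, and so on) say nothing about this; a concern metric flags that the logging concern touches two modules instead of being confined to one.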

What are the scope and benefits of big data? [closed]

What can you do after learning big data related concepts such as Hadoop, ML, and NLP? Where can you implement them?
Not really a software related question - but it's very relevant to current technology and why some software exists. So here is an opinion.
We now live in a world where it is possible to monitor and digitally record information on an epic scale that continues to expand with concepts like The Internet of Things.
With this information it becomes possible to look at the evidence behind decisions that previously would have been made by gut instinct or opinion. What impact does road design have on traffic flow? Which medical drugs get the best results in the real world, not just in drug trials? Is there a correlation between office temperature and productivity? And so on, for millions of questions in different domains.
Around the world, organisations are using data they never previously had, to get better at whatever they do (Good or bad).
The big data concepts are the tools for managing all this information. Big Data is not just large in volume, it is often unstructured and in different forms.
So, to answer your question: you can implement these concepts by working with organisations that are using Big Data. Hopefully you can see the potential of Big Data, along with the mind-bending headache it can create when you try to make sense of it.

Does Ada really reduce bugs? [closed]

I tried googling around. I found one website that talked about how many bugs were found and how much it reduced testing, but it was only one page and I found no papers on the subject.
Is there anything published to show the time or cost savings of using Ada?
There have been a few studies/papers; the one that springs immediately to mind is probably the most famous:
Comparing Development Costs of C and Ada (1995)
There is also a presentation:
Programming Languages and Lifecycle Cost (1997)
Other studies:
A Comparison of ADA 83 and C++ (1991)
Also of interest is Ironsides, which is (AFAIK) the first verified DNS server; it has a couple of papers that mention the costs of buggy/insecure software and the benefits of formal verification.
I have seen one paper comparing the success rates of student teams developing real-time systems in either Ada or C. In that experiment, success rates were significantly higher in Ada; I'll leave the actual details to the paper.
It is "Software Engineering: On the Right Track" on this page.

Can Scrum methodologies be used for electrical engineering development? [closed]

I would like to know if Scrum methodologies can be used in electrical engineering development.
Scrum is just an iterative process, so it can be pretty much applied to whatever you like.
Yes.
I would like to know if Scrum methodologies can be used in electrical engineering development
Yes, you can. You can do whatever you want as long as it benefits your organisation.
Pardon me if I am being presumptuous, but do you really mean to ask the question below?
I would like to know if Scrum methodologies would be beneficial in electrical engineering development?
Do you have the following?
1. A product or final-product vision
2. A cross-functional team that builds the product
3. Product requirements that can be converted into bite-sized, shippable user stories
4. The technical ability to work in iterations as short as 2 weeks to 1 month
5. A co-located team
If you have most of the above in your favor, Scrum should work great for you. Even if you don't, there is nothing stopping you from using it in EE development.
Can you use "standard project management" methodologies in electrical engineering? If the answer is "yes", then you can use Scrum. Scrum is "a team-based framework to develop complex systems and products". Scrum is not a "silver bullet"; it will not magically resolve the issues the team and organization have. Rather, it is a "silver mirror" that reveals dysfunctions and provides a framework and processes to help resolve them. The Scrum Alliance site has a wealth of information available; a good starting point might be http://agileanarchy.wordpress.com/2009/09/20/simple-scrum/
Many non-Software (and non-Engineering) groups are adopting Scrum.
