Is there any documentation that explains what tboot does and how it works?
http://sourceforge.net/projects/tboot/
TBoot is the reference implementation of a Measured Launched Environment (MLE) in Intel TXT terms. In its role as an MLE, TBoot can function as a boot loader and launch a whole operating system in this protected environment.
The Trusted Computing concept it implements is called late launch, or Dynamic Root of Trust for Measurement (DRTM).
What it actually does is described in Intel's Trusted Execution Technology: Software Development Guide. TBoot implements the functionality of the entities referred to there as the OS and the MLE.
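To make this concrete, tboot is usually chained from GRUB, with the kernel, initrd and the platform's SINIT ACM passed as modules. A rough sketch of a GRUB2 entry (paths, kernel version, SINIT file name and options are placeholders; the exact commands depend on your GRUB and tboot versions):

menuentry 'Linux (measured launch via tboot)' {
    multiboot2 /boot/tboot.gz logging=serial,memory,vga
    module2    /boot/vmlinuz-5.15.0 root=/dev/sda1 intel_iommu=on
    module2    /boot/initrd.img-5.15.0
    module2    /boot/SINIT_ACM.BIN
}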
I am injecting an object as a singleton into a Blazor app; it is constructed using ILogger:
public MessageBroker(ILogger<MessageBroker> logger, IOptions<MessageBrokerConfig> config)
The app (.NET Core 5.0) crashes when I navigate to that page with:
Exception":"System.PlatformNotSupportedException: EventLog access is not supported on this platform.
I guess logging is one of the trickier functions to make cross-platform because it talks directly to the OS.
Does anyone know of an alternative to ILogger that will work on Apple Silicon Macs, Windows, and Linux?
The problem is not in ILogger but in its configuration. Apparently it uses the EventLog provider, which writes logs to the Windows Event Log and is not available on other platforms.
You most likely have something like this in your logging configuration:
logging.AddEventLog();
If you don't need to use the Windows Event Log, just remove this line and use other logging providers instead. Console is the most basic one; it simply writes your logs to standard output.
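For example, a minimal .NET 5 Program.cs that drops the default providers and logs to the console only might look like this (Startup is the usual template class; adjust to your own host setup):

using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

public class Program
{
    public static void Main(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureLogging(logging =>
            {
                logging.ClearProviders(); // drops EventLog and the other default providers
                logging.AddConsole();     // works on Windows, Linux and macOS
            })
            .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>())
            .Build()
            .Run();
}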
If you are looking for more advanced scenarios, like logging to files or external log collectors, I can recommend Serilog or log4net; they should work on all platforms without an issue. You can find other alternatives in the awesome-dotnet-core repo.
If you really need to use the Windows Event Log (e.g. your production server is Windows, but you develop on a macOS machine), you should probably wrap this line in an if statement and control it with a configuration parameter:
if (context.Configuration["UseEventLog"] == "true")
{
    // Only add the Windows-only EventLog provider when explicitly enabled
    logging.AddEventLog();
}
I cannot create a new AccountInfo. I have set the Corda Accounts library to version 1.0-RC01 and am running into:
java.lang.NoClassDefFoundError: com/r3/corda/lib/accounts/contracts/states/AccountInfo
The Accounts SDK is still at the development stage, so there is no guarantee of functional stability. We also have not released any official "versions" of the Accounts SDK yet.
I've been successfully using Vivado and the SDK to develop VHDL and C for the Zynq XC7Z010 on a Digilent Zybo board. I've also been using the GNAT GPS IDE to learn Ada targeted to an STM32F4 processor (using one of the supported development boards).
GPS also ships with a set of zynq7000 run-times targeted at the XC7Z020 (as far as I can tell). Having looked through the BSPs for this target, I believe the generated code should also run on the XC7Z010, as the ARM cores appear to be the same. It may turn out that there are differences, in which case I will have a go at building a specific run-time based on the existing zynq7000 BSP (AdaCore have documented this process and give an example of generating a new STM32F4 BSP).
My main problem is that I'm not sure how to load and run the generated Ada ELF file on my Zybo. I have tried to generate a BOOT.ini file containing an FSBL (built with the SDK, using my exported hardware from Vivado), a bit-stream and the Ada ELF file. (The Zybo has a MicroSD interface that can be configured as a boot device; this works perfectly with a bit-stream and a C ELF produced via Vivado / SDK.)
Anyway, this didn't work... I'm guessing that it might be a linking issue, or a boot loader issue, or similar. With my current level of knowledge I'm just not sure at this stage.
Any advice or pointers would be greatly appreciated!
It turns out that my BOOT.ini was fine; the problem was related to accessing the custom AXI registers defined in my bit-stream. If I remove these references from the Ada code, the generated ELF file works perfectly: for example, printing over the Zybo's VCP using Text_IO.Put_Line, using the Ada run-time delay and Clock operations, etc.
For some reason the AXI interface isn't working when I boot an Ada ELF file. If I substitute the equivalent C, all is well.
This particular problem is currently unresolved, but not related to my original question!
(It might be that the Ada run-time is relying on the FSBL or U-Boot to have initialised this; I'm not sure. Feel free to comment if you know, and I'll also add a comment when I resolve this.)
**** Update ****
Here is some additional background and a description of what I had to do to get my custom AXI IPs to work.
The provided AdaCore BSP (Board Support Package, used to build the run-time) is targeted at the Xilinx ZC702 development board. I'm using a Digilent Zybo (the older version). The two boards use different Zynq parts: the ZC702 is based on an XC7Z020 and the Zybo uses an XC7Z010 (there is a newer version with an XC7Z020 option).
I followed the AdaCore instructions (available on their web site) and built a BSP specifically for the Zybo. Initially I just updated the clock details as the Zybo runs at a different speed and then verified that the Ada delay function worked correctly (provided as part of the Ravenscar run-time built from the updated BSP). However, my custom AXI IPs still didn't work...
To cut a long story short, the Ada run-time contains an assembly file called start-ram.S that, amongst other things, sets up the MMU. There is an include file called memmap.inc that contains the actual MMU page definitions as a series of .long directives. I had to update the AXI_GP0 address entry by editing the relevant directive to:
.long 0x43c10c16 # for 0x43c00000, axi_gp0
Previously it was set to 0x00000000 # for 0x43c00000, *none*. These entries are decoded within start-ram.S and then used to configure the MMU (the top 12 bits select the page and the remaining bits are chopped up and used as page configuration).
So, once I had edited this file in my Zybo BSP and re-built the run-time, the IPs became accessible from the PS and worked as expected. This all took a while to figure out, but it was worth it as I learned loads whilst exploring the dead ends!
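For reference, once the window is mapped, a custom AXI register can be touched from Ada as an address-overlaid volatile object. A minimal sketch (the register name, the value written and the base address 16#43C0_0000# are illustrative; use your own IP's offsets):

with Interfaces;              use Interfaces;
with System.Storage_Elements; use System.Storage_Elements;
with Ada.Text_IO;             use Ada.Text_IO;

procedure Axi_Demo is
   --  Illustrative 32-bit register at the base of the AXI_GP0 window
   Axi_Reg : Unsigned_32
     with Volatile, Address => To_Address (16#43C0_0000#);
begin
   Axi_Reg := 16#DEAD_BEEF#;   --  write to the custom IP
   Put_Line ("Read back:" & Unsigned_32'Image (Axi_Reg));
end Axi_Demo;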
I hope this helps someone in the future, I also highly recommend Ada for Zynq development especially if you ultimately need DO-178 certification, or similar.
I want to start using the Spring Cloud Contract framework for contract testing, but does Spring Cloud Contract support JavaScript and JMS?
I haven't found any information about this.
As for JMS, we do support it, via either Spring Integration or Apache Camel. You can also write your own JMS support; it's enough to register a couple of beans.
As for JavaScript and non-JVM languages, there's no out-of-the-box support, but we have a process for that. The workflow is described here (in those cases the consumer is a Java app, but in the next section I'll describe how the flow would differ): https://cloud.spring.io/spring-cloud-contract/spring-cloud-contract.html#_common_repo_with_contracts or https://cloud.spring.io/spring-cloud-contract/spring-cloud-contract.html#_step_by_step_guide_to_cdc. We will obviously try to simplify the process, but currently there is a bunch of manual tasks to be done (not very tedious, though).
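For orientation, the contracts kept in that common repo are plain DSL files on the producer side; a minimal Groovy contract might look like this (the URL and body values are placeholders):

package contracts

org.springframework.cloud.contract.spec.Contract.make {
    request {
        method 'GET'
        url '/fraudcheck'
    }
    response {
        status 200
        headers {
            contentType(applicationJson())
        }
        body(fraud: false)
    }
}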
The consumer can very easily download and run the stubs. Just clone https://github.com/spring-cloud-samples/stub-runner-boot, build it, and push the fat jar to your Nexus/Artifactory. This application will be used by the consumers to automatically download stubs and run them locally. As a consumer you can then call
java -jar stub-runner-boot --stubrunner.ids="com.example.groupid:artifactid:classifier:version:8090" --stubrunner.repositoryRoot="http://localhost:8081/artifactory/libs-release-local"
That way the application will start and download the provided jar with stubs from the given address where your Artifactory is. Now your front-end application can call the producer's stubs at localhost:8090.
Of course we will try to simplify the cloning and pushing process (https://github.com/spring-cloud/spring-cloud-contract/issues/37), but for now you have to do those two steps manually.
UPDATE:
In this article, https://spring.io/blog/2018/02/13/spring-cloud-contract-in-a-polyglot-world, we present a way to work in a polyglot environment. It's enough to use the provided Docker images to run contract tests against a running application and to run the stub runners too.
I use QtScript in an application to provide automation capabilities for various functions within the application.
To allow greater flexibility, I need to be able to execute other tools (command-line commands/applications) from the script and get their output (the application itself is not security relevant, so calling arbitrary code may be OK).
Is there a way to do this with the basic QtScript module or some third-party class that encapsulates it, or do I have to implement it myself?
A process can be spawned via the QProcess class, which also provides console I/O capabilities to fetch the executed process's output (standard output and standard error).
You will need a wrapper class, however, since QProcess cannot be exposed directly to the script environment in a useful way (e.g. it defines no public slots that return the captured output to a script).
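A minimal sketch of such a wrapper (class and slot names are my own; error handling and ownership details omitted):

#include <QObject>
#include <QProcess>
#include <QString>
#include <QStringList>

// Thin QObject wrapper a script can call to run an external tool and
// read its standard output (needs moc, as usual for Q_OBJECT classes).
class ProcessRunner : public QObject
{
    Q_OBJECT
public slots:
    QString run(const QString &program, const QStringList &arguments)
    {
        QProcess process;
        process.start(program, arguments);
        process.waitForFinished(-1);   // block until the tool exits
        return QString::fromLocal8Bit(process.readAllStandardOutput());
    }
};

Expose an instance with engine.globalObject().setProperty("runner", engine.newQObject(new ProcessRunner)); and a script can then call something like runner.run("ls", ["-la"]).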
See the related discussion on the QtCentre forum.