CefSharp with Any CPU fails to display browser

First up, I'm fairly new to .NET and C# and this is a project to learn C# and CEF at the same time.
I have followed a number of tutorials from the net as well as looking into the CefSharp examples to create a WinForms application.
I have installed CefSharp.WinForms 53.0.1 from NuGet, and my project is using Any CPU (CefSharp 51+ has Any CPU support).
To achieve this I largely followed the tutorial from Our Code World (http://ourcodeworld.com/articles/read/173/how-to-use-cefsharp-chromium-embedded-framework-csharp-in-a-winforms-application). I made the changes for Any CPU as suggested and included the basic code to load Google.
Everything builds fine, but when the form displays there's no browser shown, just a blank form.
If I set the target to x64 or x86, then the browser displays as expected.
I notice in the Our Code World comments that user Edek Halon has had the same issue, but no solution seems to have been provided. Edek has the same setup as me, so I wonder if this is an issue in 53.0.1? Potentially Joey De Vries in the comments has the same issue.
The addition of support for Any CPU in CefSharp is covered in this GitHub issue: https://github.com/cefsharp/CefSharp/issues/1714
There is a troubleshooting page for CefSharp (https://github.com/cefsharp/CefSharp/wiki/Trouble-Shooting) and it seems a bit contradictory. Under General Troubleshooting it says:
1) Platform Target: You must select either x86 or x64 when using the NuGet packages. If you select AnyCPU the NuGet magic won't work currently.
Does CefSharp need to be built from source for Any CPU to work?

In case anyone else is having difficulties with this, follow the GitHub tutorial at the following link: https://github.com/cefsharp/CefSharp/issues/1714
Basically the code should be as follows (WinForms example):
CefSharpSettings.SubprocessExitIfParentProcessClosed = true;
Cef.EnableHighDPISupport();
CefSettings settings = new CefSettings
{
    // By default CefSharp uses an in-memory cache; specify a cache folder to persist data.
    CachePath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "CefSharp\\Cache"),
    // Note: this is a verbatim string prefixed with '@', not '#'. Hard-coding x86 works for a
    // 32-bit process; if the Any CPU build can run 64-bit, see the resolver sketch below.
    BrowserSubprocessPath = @"x86\CefSharp.BrowserSubprocess.exe"
};
Cef.Initialize(settings, performDependencyCheck: true, browserProcessHandler: null); // Initialize CEF with the provided settings
chromeBrowser = new ChromiumWebBrowser("http://ourcodeworld.com"); // Create a browser component
this.Controls.Add(chromeBrowser); // Add it to the form and fill the form window
chromeBrowser.Dock = DockStyle.Fill;
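For a build that genuinely targets Any CPU, the GitHub issue above pairs the CefSharpAnyCpuSupport project property with an AssemblyResolve hook, so the CefSharp assemblies load from whichever of the x86/x64 subfolders matches the running process. A sketch based on the example in that issue (hedged; adapt the names to your project):

using System;
using System.IO;
using System.Reflection;
using System.Runtime.CompilerServices;

public static class Program
{
    [STAThread]
    public static void Main()
    {
        AppDomain.CurrentDomain.AssemblyResolve += Resolver;
        LoadApp(); // keep CefSharp types out of Main so the resolver is registered first
    }

    [MethodImpl(MethodImplOptions.NoInlining)]
    private static void LoadApp()
    {
        // Cef.Initialize(...) and Application.Run(...) go here, in a method that is
        // JIT-compiled only after the resolver has been hooked.
    }

    private static Assembly Resolver(object sender, ResolveEventArgs args)
    {
        if (args.Name.StartsWith("CefSharp"))
        {
            string assemblyName = args.Name.Split(new[] { ',' }, 2)[0] + ".dll";
            string archSpecificPath = Path.Combine(
                AppDomain.CurrentDomain.SetupInformation.ApplicationBase,
                Environment.Is64BitProcess ? "x64" : "x86",
                assemblyName);
            return File.Exists(archSpecificPath) ? Assembly.LoadFile(archSpecificPath) : null;
        }
        return null;
    }
}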

Related

Facing "CefSharp.BrowserSubprocess has stopped working" after CefSharp version update

We have a VB Windows application project using CefSharp to display a browser screen showing some data.
As part of an Angular upgrade we migrated our CefSharp version from 75.1.142 to 98.1.21.
The reason for not updating to the latest CefSharp version is that when we tried it, the application layout was completely affected: font sizes shrank and controls broke. So we were forced to use a lower version.
We used the NuGet package manager to update the version.
It works fine on our local machines, but after deploying the changes to the deployment machine we get this:
CefSharp.BrowserSubprocess has stopped working
A problem caused the program to stop working...
Please note: when we compare the .csproj after the update to 98.1.21, we can see that CefSharp.WinForms is now missing from the imported project list in our latest changes.
Please note: we currently use CefSharp in 2 projects, and the target framework of both projects is .NET Framework 4.6.2.
This is what we have in our CefSettings initialization method:
Dim settings As CefSettings = New CefSettings()
settings.RemoteDebuggingPort = FindAFreeTcpPort()
settings.CefCommandLineArgs.Add("disable-gpu", "1")
settings.LogFile = cefLogFileName
settings.LogSeverity = LogSeverity.Error
CefSharp.Cef.Initialize(settings)
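(FindAFreeTcpPort here is not a CefSharp API but the asker's own helper; a typical implementation binds a listener to port 0 so the OS picks a free port - a C# sketch for reference:)

using System.Net;
using System.Net.Sockets;

static int FindAFreeTcpPort()
{
    // Bind to port 0: the OS assigns an unused port, which we read back and release.
    var listener = new TcpListener(IPAddress.Loopback, 0);
    listener.Start();
    int port = ((IPEndPoint)listener.LocalEndpoint).Port;
    listener.Stop();
    return port;
}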
And my ChromiumWebBrowser is initialized like this:
_chromeBrowser = New ChromiumWebBrowser(url)
_chromeBrowser.JavascriptObjectRepository.Settings.LegacyBindingEnabled = True
_chromeBrowser.JavascriptObjectRepository.Register("bond", New ChromiumWebBrowserEventHandler(Me), True, BindingOptions.DefaultBinder)
Please note: I haven't provided any BrowserSubprocessPath in my CefSettings.
And in both of my .csproj files I have this:
<CefSharpAnyCpuSupport>true</CefSharpAnyCpuSupport>
Please let me know if you need to see how my .csproj is configured with the CefSharp DLL paths.
We also tried different versions, but the same error keeps popping up.
Any idea why I am facing this problem?
Finally, after many days of nail-biting, I managed to solve this problem with these steps:
Updated CefSharp to version 107.1.121 (the latest version affected our Windows application layout, so we were forced to stick with this version).
Moved LegacyBindingEnabled = True from my CefSettings to the ChromiumWebBrowser initialization section, like below:
_chromeBrowser = New ChromiumWebBrowser(url)
_chromeBrowser.JavascriptObjectRepository.Settings.LegacyBindingEnabled = True
_chromeBrowser.JavascriptObjectRepository.Register("bond", New ChromiumWebBrowserEventHandler(Me), True, BindingOptions.DefaultBinder)
Modified the CefSettings section by removing the LegacyBindingEnabled = True code.
Then set the BrowserSubprocessPath to an absolute path (important), like this:
Dim assemblyPath = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location)
If Not assemblyPath.EndsWith("\") Then
    assemblyPath = assemblyPath & "\"
End If
settings.BrowserSubprocessPath = assemblyPath & "CefSharp.BrowserSubprocess.exe"
Hope this helps someone who upgrades their Angular version and gets stuck because of the CefSharp version change.
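(For reference, the same absolute-path fix in C#, using Path.Combine so the manual trailing-backslash handling isn't needed - a sketch assuming settings is your CefSettings instance:)

using System.IO;
using System.Reflection;

// Resolve the directory the executing assembly was loaded from, then build the path.
var assemblyDir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
settings.BrowserSubprocessPath = Path.Combine(assemblyDir, "CefSharp.BrowserSubprocess.exe");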

Rendering Autodesk Forge Viewer offline on mobile freezes in the Chrome browser on Android

I have hosted my offline model file on a local server, and when connected to the same server everything is accessible to me via an Android phone.
I have created a sample project where Forge renders the offline model file. It works smoothly in the Chrome browser on my laptop, but in my Xamarin.Forms WebView and in Chrome on the phone the model renders with lag, as my model has too many details and nodes; I cannot even perform simple zoom-in/zoom-out.
The same thing running on iOS (Safari browser) works smoothly and without any issue.
I want to understand whether there is any setting affecting Chrome or the native Android browser that makes the model load with lag.
I have tried all the possible solutions available on Google:
1. I have implemented this in a Xamarin.Forms custom WebView renderer and gave it all the required resources I can.
Here are the settings:
var mWebView = new global::Android.Webkit.WebView(MainActivity.Main.ApplicationContext);
WebSettings settings = mWebView.Settings;
settings.JavaScriptEnabled = true;
settings.LoadWithOverviewMode = true;
settings.UseWideViewPort = true;
settings.SetSupportZoom(true); // SupportZoom() only reads the flag; SetSupportZoom(true) enables it
settings.BuiltInZoomControls = false;
settings.SetLayoutAlgorithm(WebSettings.LayoutAlgorithm.SingleColumn);
settings.CacheMode = CacheModes.CacheElseNetwork;
settings.DomStorageEnabled = true;
mWebView.ScrollBarStyle = ScrollbarStyles.OutsideOverlay;
mWebView.ScrollbarFadingEnabled = true;
mWebView.SetLayerType(LayerType.Hardware, null); // force a GPU-backed layer for the view
SetNativeControl(mWebView);
2. Added hardware acceleration in the manifest and also set the large-heap flag to true in the manifest (see the sketch at the end of this question).
3. Followed this too, for the Autodesk viewer memory limit: https://forge.autodesk.com/en/docs/viewer/v7/developers_guide/viewer_basics/memory-limit/
4. I tested the above solution with Firefox and it works smoothly; I am not sure what they do differently to render these files.
Can anyone help me with what I should try to solve this issue?
My devices are mid-range with 8 GB RAM and 128 GB internal storage; I am testing on 2 devices (Samsung M40 and OnePlus 7T).
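Two notes on the steps above. First, to see where the rendering time actually goes, you can enable remote debugging of the Android WebView and profile the page from chrome://inspect on a desktop machine (a sketch; the API is available on Android 4.4+):

using Android.OS;
using Android.Webkit;

if (Build.VERSION.SdkInt >= BuildVersionCodes.Kitkat)
{
    // Makes this app's WebViews visible to desktop Chrome DevTools via chrome://inspect.
    WebView.SetWebContentsDebuggingEnabled(true);
}

Second, the manifest flags from step 2 can also be generated from the Xamarin.Android Application class, which keeps them next to the code (a sketch; MainApp is a placeholder name):

using System;
using Android.App;
using Android.Runtime;

// Emits android:hardwareAccelerated="true" and android:largeHeap="true"
// into AndroidManifest.xml at build time.
[Application(HardwareAccelerated = true, LargeHeap = true)]
public class MainApp : Application
{
    public MainApp(IntPtr handle, JniHandleOwnership transfer) : base(handle, transfer) { }
}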
I have the same issue: I am unable to move the model on an Android device. I am using both the online and offline viewers.
I am using a Xamarin.Forms custom WebView renderer, as in the question.
Autodesk team, please reply to us with Petr Broz's solution.
I'm afraid this (running a web application inside a Xamarin WebView) is beyond our area of expertise but I would suggest the following:
try running a vanilla three.js application inside the WebView, with a reasonably complex scene as well (for example, convert one of your Forge models into glTF using https://github.com/petrbroz/forge-convert-utils, and load the glTF)
if the three.js application has similar performance issues, it's most likely an issue on the Xamarin side, and something we won't be able to help with
if the three.js application is working fine, please send us (forge (dot) help (at) autodesk (dot) com) a basic Xamarin project with both Forge Viewer and with Three.js that can demonstrate the performance differences, and we would try and allocate some time for debugging it, and see if there's anything we can do to help

ASP.NET using AtalaSoft to convert Tiff compression

Using Atalasoft's free SDK,
http://www.atalasoft.com/free-dotnet-image-sdk
I added references to the DotImage and DotImage.Lib DLLs in Visual Studio 2010.
My code:
Atalasoft.Imaging.AtalaImage image = new Atalasoft.Imaging.AtalaImage(fileName);
Atalasoft.Imaging.Codec.TiffEncoder encoder = new Atalasoft.Imaging.Codec.TiffEncoder();
encoder.Compression = Atalasoft.Imaging.Codec.TiffCompression.Group4FaxEncoding;
image.Save(fileName, encoder, null); // destroys the original.
However, when I run the code I get an error on the very first line:
Unable to retrieve security descriptor for this frame.
Can anyone help me out with this?
Update:
I added a further line of code:
System.Security.Permissions.FileIOPermission f2 = new System.Security.Permissions.FileIOPermission(System.Security.Permissions.FileIOPermissionAccess.AllAccess, fileName);
Still the same error.
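(As an aside, since the Save call in the snippet above overwrites the source file, a safer variant writes to a temporary file first and only replaces the original on success - a sketch using the same Atalasoft calls:)

using System.IO;
using Atalasoft.Imaging;
using Atalasoft.Imaging.Codec;

string tempFile = Path.GetTempFileName();
using (var image = new AtalaImage(fileName))
{
    // Note: Group4 fax compression applies to 1-bit (bilevel) images.
    var encoder = new TiffEncoder { Compression = TiffCompression.Group4FaxEncoding };
    image.Save(tempFile, encoder, null);
}
File.Copy(tempFile, fileName, true); // replace the original only after a successful save
File.Delete(tempFile);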
Philo,
Hi, I'm the support engineer you called in to yesterday. I apologize - after you called in, I received a note from our chief software architect asking us to help you out.
If you are still experiencing your issue, please do call back in and/or create a support case on our portal at https://www.atalasoft.com/support/my-portal/cases
A couple of things that come to mind from your case: make sure you're targeting either x86 or x64 in your project's platform target (DotImage "has bitness") and make sure you're using the appropriate x86 or x64 Atalasoft references. (I strongly suggest our x86 while getting started, as x64 has some additional hoops to jump through to get the licensing working.)
Atalasoft does ship some AnyCPU dlls but they're for an extremely limited subset of use cases and if you have referenced those and/or are attempting to target your project to AnyCPU, this will cause all sorts of odd behavior.
Also, if you're targeting .NET Framework 4.0, make sure you're targeting the full framework and not the "Client Profile", as DotImage has dependencies on components not present in the Client Profile version.
~DigitalSorceress
Do you have the file with the .lic extension in the project section on the right side? Make sure about that.

Uploading big Files

I am an amateur ASP.NET developer working on my first job (a friend's website). ASP.NET v4.0, using VS2010.
His company makes 3D models (using a 3D printer). The website is currently in development but can be found here. I would be the first to admit that the code has been a bit rushed and hacked in, but my friend is quite happy with what he has so far.
One of the requirements is that his customers need to be able to upload their model design files, which could be up to 100 MB each (or maybe more). I am struggling to get this to work properly.
I started by using the built-in <asp:FileUpload ID="FileUpload1" runat="server" /> tag and an animated GIF image, similar to the idea described in Joe Stagner's tutorial - thanks Joe, I like your presentations a lot.
This worked OK for a small test file but doesn't give any indication of upload progress. So I tried to improve my solution using ideas developed from Sunasara Imdadhusen in his Code Project article. My upload code looks like this:
Task t = Task.Factory.StartNew(() =>
{
    byte[] buffer = new byte[UPLOAD_BUFFER_BYTE_SIZE];
    // Upload the file in chunks so that we can measure how long it is taking.
    using (FileStream fs = new FileStream(Path.Combine(newQuotePath, filename), FileMode.Create))
    {
        DateTime stopwatch = DateTime.Now;
        while (stats.Uploaded < stats.TotalSize)
        {
            int bytecount = postedFile.InputStream.Read(buffer, 0, UPLOAD_BUFFER_BYTE_SIZE);
            if (bytecount == 0)
                break; // end of stream; guards against spinning if TotalSize is never reached
            fs.Write(buffer, 0, bytecount);
            stats.Uploaded += bytecount;
            // The rate should be based on the bytes actually read, not the buffer size.
            double dRate = bytecount / Math.Abs((DateTime.Now - stopwatch).TotalSeconds);
            stats.Rate = (int)(Math.Min(dRate, int.MaxValue));
            // Sleep is for debugging only!
            //System.Threading.Thread.Sleep(2000);
            stopwatch = DateTime.Now;
        }
    }
}, TaskCreationOptions.LongRunning);
Where stats is a reference to a class that is stored as a session variable, and is accessed from a JavaScript function running in a setInterval(...) (while the upload is taking place) via the PageMethod:
[System.Web.Services.WebMethod]
[System.Web.Script.Services.ScriptMethod]
public static UploadStatus GetFileUploadStatus()
{
UploadStatus stats = (UploadStatus)HttpContext.Current.Session["UploadFileStatus"];
if ((stats != null) && (stats.IsReady))
{
return stats;
}
else
{
return null;
}
}
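For reference, a minimal sketch of the UploadStatus class implied by the code above (member names inferred from the snippets; the real class may carry more):

using System;

// Serializable so it can survive out-of-proc session state (relevant on a web farm).
[Serializable]
public class UploadStatus
{
    public long TotalSize { get; set; } // total bytes expected
    public long Uploaded { get; set; }  // bytes written so far
    public int Rate { get; set; }       // estimated upload rate, bytes per second
    public bool IsReady { get; set; }   // true once the tracking data is valid to report
}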
This worked on my local machine (using the thread sleep to slow down the upload). So I published it to our host 123-Reg and it didn't work as expected. The animated GIF comes on when the upload starts, but the progress bar doesn't start moving. The file input control and the submit button (in an IFrame) that should get disabled as soon as the upload starts take ages to become disabled. Then the web page just hangs there. After waiting for a while I clicked the page refresh button. When the page refreshed it showed that my test file had been uploaded successfully. I tried a 16 KB file and a 1.6 MB file.
Since this worked on my local machine, I suspect that this is happening because our website host (123-Reg) is using a web farm.
Anyway, I thought this was going to be easy but it is not, and so I had a look for some open-source upload progress bars. I had a look at NeatUpload but it says "By default, NeatUpload won't work properly on a web garden or web farm", and it also says that to get it to work on a web farm you should "Specify the same random 32-hex-digit decryptionKey attribute in the <machineKey> section of each server's Web.config". But I don't think I have access to 123-Reg's server config files(?).
My friend has suggested that perhaps we could use a service such as Dropbox, but I had a look and I have no idea how to add this to our website. This might be a good option since they might be optimized for uploads.
Any advice or suggestions would be very much appreciated - thanks.
EDIT: The story so far ... still struggling with this.
I looked into using Dropbox in detail (and other cloud storage providers such as SkyDrive etc.). It seems that these services are designed around providing applications that interact with an individual user's storage. I wanted all users to be able to upload files to MY Dropbox folder but without sharing (customers should not have access to other customers' design files). Anyway, I carried on, set up a Dropbox account, installed the SharpBox SDK and reprogrammed my upload code (the bit inside the task action). This seemed to work OK on my local machine; it was a bit slower because it was uploading to the Dropbox server (I didn't need the Thread.Sleep). I published the website to the 123-Reg server and got a similar, unusable experience as before.
So far I had been testing on IE9, and I just happened to try it out with Chrome and I noticed a funny thing. Just after clicking the upload button, but before the page refreshed, Chrome showed an upload-percent-complete indicator in the bottom left. This ran to 100%, then my controls started to show some action before the whole page started to hang again.
According to MSDN, the documentation for HttpPostedFile says:
"By default, all requests, including form fields and uploaded files, larger than 256 KB are buffered to disk"
So does this mean that my Task is being started after the painful part of waiting for the upload (which is actually happening between the client and a buffer)? If this is the case then it would not make sense to make it more painful by then sending the file off to dropbox right? And my progress bar is tracking the progress of the wrong bit?
(Today I am going to check the error handling of my code since I have a feeling that the reason it is hanging is because it is not recovering from an exception properly). It certainly feels like I am learning a lot by doing this.
This might be a bit overkill, but you could look into using blueimp's jQuery File Upload tool (demo site here: http://blueimp.github.com/jQuery-File-Upload/) - a developer called Max Pavlov has modified the original version (originally for non-.NET technologies) for use in MVC 3.
It can be found here on GitHub: https://github.com/maxpavlov/jQuery-File-Upload.MVC3 . I have successfully implemented this in both MVC 3 and MVC 4 Beta. The only thing I had to do to make it work effectively in MVC 4 was remove some of the ClientDependency code (this is a DLL that handles bundling and minification of JS and CSS files), as it replicates functionality already in MVC 4 but not in MVC 3. Additionally, I have added some pages to the wiki over on GitHub describing what I did, although it's not complete. If you decide to go down this route I could update some details there with my more recent findings. My notes so far on MVC 4 integration can be found here: https://github.com/maxpavlov/jQuery-File-Upload.MVC3/wiki/MVC-4---EnableDefaultBundles .
Incidentally, I have managed to upload files up to about 2 GB using this tool and have found it quite flexible!

I want my DotNetNuke modules to work under as many versions as possible while avoiding assembly binding redirection

I am developing DotNetNuke modules and naturally want them compiled before installing or distributing them. In the past I've simply referenced a specific version of DotNetNuke.dll by browsing to the /BIN folder of a local DotNetNuke installation.
This reference allowed me to use the DNN base classes and create my own set of classes upon those. I also use various helper methods throughout the DNN namespaces/classes that I require (i.e. make derived classes from their PortalModuleBase and ModuleSettingsBase, and use their Localization classes, which replace those provided by Microsoft's ASP.NET implementation).
I've been able to get away with this approach making that direct DLL reference (Copy Local = True, Specific Version = False) because until now I've been installing these modules onto client websites that I maintain. As such, I've kept them on at least the version of DotNetNuke I've been developing on - or newer. Most recently, I was referencing 6.1.3.108 in development.
NOTE: This automatically copies in the following associated DLLs into the /BIN directory of my modules:
DotNetNuke.dll
DotNetNuke.Instrumentation.dll
dotnetnuke.log4net.dll
DotNetNuke.Services.Syndication.dll
DotNetNuke.Web.Client.dll
DotNetNuke.WebControls.dll
DotNetNuke.WebUtility.dll
Installing this onto a DotNetNuke site of a NEWER version worked fine, which isn't a bad start.
What I've been wondering, though, is whether there is a non-hackish way of making my modules insensitive to the minor, build, or revision levels of the DLL.
I realize that it makes it my responsibility to ensure the product (if developed on a "mid range" version) still works on slightly earlier as well as newer versions of the product. That said, I feel I can do thorough testing across those builds. To me this is preferential to having to run the OLDEST major build in development.
Put another way, I'd rather not develop with references to 6.0.0.0 just so it works on 6.x.x.x without extra effort. I'll only do that if no one has a brilliant way for me to make a reference to, say, 6.1.3.108 work on slightly earlier or later versions. (Naturally I'm okay with having to make a different module for major version changes, such as 5.x.x.x or 7.x.x.x.)
Thanks in advance!
Instead of referencing the assemblies in the bin folder, keep a copy of DotNetNuke.dll (and any other references) with your source code, and reference it there. Put the oldest supported version there, but develop on a newer site. Set Copy Local=False on the reference so you don't overwrite the newer version, and you should be fine.
In this way, we're able to reference DNN 4.5.3 while developing a module that runs on DNN 6.1.x. I've been using this method for years without any significant problems (except when I occasionally forget to turn off Copy Local and my DNN site mysteriously blows up).
Regarding determining the version of DNN in a class you've subclassed from a DNN one:
Here's what I would do, assuming YourClass inherits from DNNClass but, because you have referenced an earlier version, a property 'NewProp' doesn't exist. Here's how to do it:
public class YourClass : DNNClass
{
    public string NewPropSubstitute
    {
        get
        {
            string newPropVal = "your default if earlier DNN";
            // Look the property up by name; it is only found on DNN builds that have it.
            System.Reflection.PropertyInfo pi = this.GetType().GetProperty("NewProp");
            if (pi != null)
                newPropVal = (string)pi.GetValue(this, null);
            return newPropVal;
        }
    }
}
That's a made-from-memory guess so it might not compile, but you get the idea. You don't necessarily have to get the DNN Version if you want - just try and get the property through reflection - if it's there, implicitly you've got the right version.
Of course this method assumes you can substitute in a value for a later-DNN property (or method) if the DNN version doesn't support it. But that all depends on what you're trying to do.
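The same pattern works for a later-DNN method (a sketch; "NewMethod" is a placeholder name):

public string CallNewMethodOrDefault()
{
    string result = "your default if earlier DNN";
    // Look the method up by name; it is only found on DNN builds that have it.
    System.Reflection.MethodInfo mi = this.GetType().GetMethod("NewMethod", Type.EmptyTypes);
    if (mi != null)
        result = (string)mi.Invoke(this, null);
    return result;
}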
If you do want to find the DNN Version (version safe and always correct) you can use the code for that which is embedded in my version-safe jQuery inclusion code, linked from this blog post:
Using jQuery in DotNetNuke 5 and 6
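For reference, a version check along those lines might look like this (a hedged sketch, not the code from the linked post): find the loaded DotNetNuke assembly and compare its version, so the check works regardless of which build the module was compiled against.

using System;
using System.Reflection;

Version dnnVersion = null;
foreach (Assembly a in AppDomain.CurrentDomain.GetAssemblies())
{
    if (a.GetName().Name == "DotNetNuke")
    {
        dnnVersion = a.GetName().Version;
        break;
    }
}
bool supportsNewApi = dnnVersion != null && dnnVersion >= new Version(6, 1);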
