JavaScript requesting file with inherited protocol - http

I have a Titanium app that embeds some third-party JavaScript code, which in turn requests more files that it uses. The problem is that these files are requested with the protocol-relative style (//example.com/file.js), and Titanium appears to resolve such requests as local files. If I run the app in an iOS simulator and debug in Safari, the following console message is shown:
[Error] Failed to load resource: resource unavailable file://www.documentcloud.org/documents/2179503-superior-court-lawsuit-intersal-v-nc.js?_=1465333443448
Has anybody else run into this issue, or does anyone know of a way to solve it?

Please keep in mind that the final app is a native app. It is not possible to include external JavaScript files the way you can in a browser/webview.
Yes, there is still JavaScript in the final, native app, but it does not come with any mechanism for loading external JavaScript.
If you want to include an external JavaScript file, you will need to package it with the app.
It might also be possible to fetch an external JavaScript file with the HTTP client and store it locally, but I have not tried this and do not recommend it.
In your case, just fetch the data as JSON by changing the .js in the URL to .json: http://www.documentcloud.org/documents/2179503-superior-court-lawsuit-intersal-v-nc.json?_=1465333443448
You can then fetch this with the HTTP client and use the data as usual, as in the sketch below.
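A minimal sketch of that approach, assuming Titanium's Ti.Network.createHTTPClient API; the property read from the parsed JSON is only a placeholder for whatever fields you actually need:

var url = 'http://www.documentcloud.org/documents/2179503-superior-court-lawsuit-intersal-v-nc.json';

var client = Ti.Network.createHTTPClient({
    onload: function () {
        // responseText holds the raw JSON returned by the server
        var doc = JSON.parse(this.responseText);
        Ti.API.info('Fetched document: ' + doc.title); // 'title' is an assumption about the payload
    },
    onerror: function (e) {
        Ti.API.error('Request failed: ' + e.error);
    },
    timeout: 10000
});

client.open('GET', url);
client.send();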

Related

How can we get Relay to work in production with NextJS?

I have a NextJS project using Relay. It works fine in development, but when I build, NextJS generates static pages and tries to reach my GraphQL server (in dev it is pointed at https://localhost:3000/api/graphql). I don't want it to, since it should be a dynamic page.
On top of that, I can't seem to get SSR working with Relay, since a lot of Relay's functionality relies on hooks, and hooks can't be used outside React components (for example in getServerSideProps()). I got as far as using loadQuery from Relay in getServerSideProps, but now my issue is that I need to get the Relay environment somehow, and again I can't use getRelayEnvironment() in there either. I can import it from the createRelayEnvironment file, but then I'm not using my App's environment (the RelayEnvironmentProvider at the root of my App).
Anyone have success with using Relay in NextJS?
I don't know how I missed this, but I followed along with NextJS's example for using Relay Modern on GitHub.
I didn't do everything the same - I don't have a .babelrc file, for example, because that info is in the next.config.js file (thanks to NextJS 12.1).
What I really used from it was how they create and use the Relay environment in their relay.js file. I then used that in a getServerSideProps function in my page, just like they did in their index.js file; a rough sketch of the idea is shown below.
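A rough sketch of that pattern, not the exact code from the example repo: the GraphQL endpoint, the viewer field, and the query name are placeholders for your own schema, and the global fetch assumes Node 18+ (otherwise use a fetch polyfill).

// pages/index.js
import { Environment, Network, RecordSource, Store } from 'relay-runtime';
import { fetchQuery, graphql } from 'react-relay';

// Build a fresh Relay environment on the server for each request.
function createServerEnvironment() {
  const network = Network.create(async (operation, variables) => {
    const res = await fetch(process.env.GRAPHQL_ENDPOINT, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ query: operation.text, variables }),
    });
    return res.json();
  });
  return new Environment({ network, store: new Store(new RecordSource()) });
}

// Placeholder query; the Relay compiler must process this file.
const query = graphql`
  query indexViewerQuery {
    viewer {
      name
    }
  }
`;

export async function getServerSideProps() {
  const environment = createServerEnvironment();
  // fetchQuery returns an observable; toPromise() resolves with the query data.
  const data = await fetchQuery(environment, query, {}).toPromise();
  return { props: { data } };
}

export default function Index({ data }) {
  return <div>{data.viewer.name}</div>;
}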

My single page website doesn't have API calls or URL parameters. Can I implement Angular Universal with Firebase hosting without using Cloud Functions?

I have an Angular 9 application deployed on Firebase, and I am planning to implement Angular Universal SSR for SEO.
Could anyone please clarify my doubts regarding Angular 9 Universal + Firebase hosting?
(I am currently using the Spark plan, which does not include Cloud Functions.)
Can I implement SSR without Firebase Cloud Functions?
Also,
My website doesn't have API calls (other than Google Analytics on index.html), no database connectivity, and no query string parameters. That means my home page contents are always the same**. In this case, can I use static server-side rendering without Cloud Functions?
If this is possible, how do I deploy the output to Firebase?
3.1 Will copying the dist folder contents to the server work?
3.2 How do I run the SSR and non-SSR versions locally?
Note: **My website is not a static HTML page; I provide some client-side functionality using JavaScript/TypeScript code that does not make server calls.
Other than implementing Angular Universal, is there any way to achieve SEO with Angular apps?
This can be done via Angular prerendering.
Prerendering does not require Cloud Functions or Node support on the server.
It is easy, and Angular CLI versions above 11 support it directly.
Web crawlers can then read the entire HTML content, because the output of prerendering is rendered HTML files.
Steps
Install the Angular Express engine using the command below:
ng add @nguniversal/express-engine --clientProject project-example
Run the command below:
npm run prerender
Go to the dist folder.
Copy/deploy the contents of the browser folder to your server environment (for Firebase Hosting specifically, see the firebase.json sketch after the note below).
Note that any code that accesses window, document, or navigator will throw an error during prerendering and should be wrapped in a platform check like this:
if (isPlatformBrowser(this.platformId)) {
    // your code accessing window, document, navigator
}
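For the Firebase Hosting deployment itself, a hedged example: pointing hosting at the prerendered browser folder in firebase.json and running firebase deploy --only hosting is typically enough. The project folder name below is an assumption; check the actual path under dist.

{
  "hosting": {
    "public": "dist/project-example/browser",
    "ignore": ["firebase.json", "**/.*", "**/node_modules/**"]
  }
}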

How meteor handles files with same name but different paths

In my Meteor project, can I have app/client/abc.js AND app/server/abc.js and expect Meteor to handle them gracefully?
Yes. File names in Meteor are not currently meaningful apart from manipulating load order. See the "Structuring your app" section of the docs.
Yes, you can have abc.js in both the client and server folders. However, /client/abc.js will be loaded only on the client, whereas /server/abc.js will be loaded only on the server. For example:
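A hypothetical pair of files, just to illustrate the split (the template and method names are made up):

// app/client/abc.js - loaded only in the browser
Template.hello.events({
  'click button': function () {
    console.log('clicked on the client');
  }
});

// app/server/abc.js - loaded only on the server
Meteor.methods({
  sayHello: function () {
    return 'hello from the server';
  }
});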

cloning a meteor app

Copying a static website (HTML, CSS, JS) is very simple.
Copying a dynamic website is difficult because of the server-side scripts.
I'm concerned about cloning of Meteor apps: most of the server-side scripts are eliminated, and the only thing that needs to be copied is the database; the schema can be easily obtained from the live Meteor app, and the data can be easily scraped from it.
If a successful Meteor app can be easily cloned, no one would prefer to develop an app on Meteor.
Is there a way to stop an existing Meteor app from being cloned?
Well, technically a Meteor app can be cloned; it depends on your directory/file structure and whether you're running it in development mode. It's a problem if you're using one file and this sort of structure to separate your code:
if(Meteor.isClient) {
}
if(Meteor.isServer) {
}
because that whole file is sent down to the client, so anyone can fetch it.
So it might be better to move to this structure:
/client - place the code from your Meteor.isClient blocks in a new JS file
/server - place your server-side code in a new JS file
/public - place your other public assets
That way no one will see the server-side scripts, so they can't clone the backend of your app.
Production mode/Dev mode
In addition, if you run your Meteor app in production mode, the JavaScript is packed and the Handlebars templates are precompiled.
In my opinion, it might actually be harder to copy a Meteor app than the earlier kinds of web apps, because the HTML is rendered on the client side: fetching the HTML files will actually get back empty HTML files, and even if you prettify the large JS file you are still left with precompiled Handlebars templates. In addition, the files are merged into one!
So much for cloning it into another Meteor app. Even if getting the client script is possible (as with any other stack), there are even more hurdles with Meteor when it comes to replicating the server script:
DDP
Attempting to clone it to a PHP/server-side script stack might be even harder, because POST/GET aren't even used; DDP is used instead.
Schema
With regard to the schema, you can control what the client sees via Meteor.publish, so they won't actually see the whole schema. For example:
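A small sketch of that idea (the Posts collection and field names are hypothetical):

// server - only publish the fields the client actually needs
Meteor.publish('publicPosts', function () {
  return Posts.find({ published: true }, { fields: { title: 1, summary: 1 } });
});

// client - the subscription only ever sees title and summary
Meteor.subscribe('publicPosts');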

Loading external xsd and dtd render my application long start up time

I'm developing a webapp using Tiles and Spring MVC. Because of the XSD and DTD validation on the Tiles definitions and the Spring MVC bean declarations, every time the webapp is started or restarted, requests are sent to an external server for the XSD and DTD files. I noticed this because my webapp occasionally failed to start due to a failed request to that external server (!!!).
I wonder if there is a way to tell my app to stop doing that? For example, place a cached version of these files somewhere, or tell the XML processor not to validate these XML files at run time?
I'm facing a similar problem (but with XSD files). After a little research, it appears that, generally, foo-1.0.jar will contain foo-schema-1.0.xsd, and therefore when foo goes to validate its foo-config.xml it doesn't need to ask the Internet for the XSD.
The problem comes when you upgrade to foo-1.1.jar (which includes the new foo-schema-1.1.xsd) without changing your foo-config.xml to reference the new version of the schema. foo-1.1.jar doesn't contain foo-schema-1.0.xsd, so the parser looks for it on the Internet. If the site it is trying to reach is down, you have problems.
So check your XML files to make sure they're referencing the version of the XSD/DTD appropriate for the jar version that is validating them. For the Spring bean files, that looks something like the snippet below.
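As a hedged illustration for the Spring case: a versionless (or version-matched) schema location lets Spring resolve the XSD from the jar on the classpath (via its META-INF/spring.schemas mapping) rather than fetching it over the network. Only the root element is shown here.

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">
    <!-- bean definitions -->
</beans>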
