Migrate from SBJson 2.2 to SBJson 5

Hello, we are using the SBJson library via pod 'SBJson', '~> 2.2.3' and want to migrate to the latest version, pod 'SBJson5', '~> 5.0.0'.
What will the migration effort be? Will the new version still support the old methods, or do I need to track down the deprecated or changed methods and any other specific changes?
If anyone has used this library or done an SBJson migration, please post to this thread.
Please suggest a solution.

I don't know how complicated your usage is, but you will definitely have to do some manual edits. I released version 3, with a major shift in focus towards streaming JSON, way back in 2011, and I'm frankly amazed version 2.x.x even compiles today!
I tried to document the changes in the repo's NEWS file. When naming the releases I tried to follow SemVer, which means the 2.x.x -> 3.x.x, 3.x.x -> 4.x.x and 4.x.x -> 5.x.x bumps were all major versions because each broke backwards compatibility in some way.
For version 3 the API went through a big change to focus on streaming; much of the API changed to support this. Version 4 further removed old methods that had been obsoleted in the 3.1.x, 3.2.x and 3.3.x series. For 4 -> 5 the breaking change was very small: the API stayed the same, and only the behaviour is slightly different. ("Naked scalars", e.g. strings and numbers not wrapped in an array or object, are now accepted, as per the updated JSON RFC.)
Have you considered whether you actually need to use SBJson? You may be better off just using NSJSONSerialization. After all, it's been in the iOS SDK since iOS 5, and in the Mac SDK since a bit after that. The main benefit of using SBJson is being able to start parsing JSON before it's fully downloaded (thus improving perceived latency).
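For illustration, a minimal sketch of the plain NSJSONSerialization approach on a fully-downloaded payload (shown in Swift, where the class is named JSONSerialization; the sample payload is made up):

import Foundation

// Made-up payload standing in for a fully-downloaded JSON response.
let data = Data("""
{"name": "example", "values": [1, 2, 3]}
""".utf8)

do {
    let object = try JSONSerialization.jsonObject(with: data, options: [])
    if let dict = object as? [String: Any] {
        print(dict["name"] ?? "missing")   // prints "example"
    }
} catch {
    print("JSON parse failed: \(error)")
}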
Other issues you may run into: I have fixed a lot of bugs since the 2.x.x releases, which means the parser is much stricter now than it was back then. (Due to bugs it used to allow all sorts of broken JSON and broken UTF-8.) So if you rely on any of that behaviour you will be out of luck. Hopefully that's not the case :-)

Related

Truth extensions causing rest of project to downgrade to guava android

If I add the com.google.truth.extensions:truth-proto-extension:1.1 jar to my bazel workspace, it seems to totally nuke the classes from com.google.guava:guava:28.2-jre, resulting in errors like
import static com.google.common.collect.ImmutableMap.toImmutableMap;
                                                     ^
  symbol:   static toImmutableMap
  location: class ImmutableMap
java/com/google/fhir/protogen/ProtoGenerator.java:316: error: cannot find symbol
        .collect(toImmutableMap(def -> def.getId().getValue(), def -> def));
                 ^
  symbol:   method toImmutableMap((def)->def[...]lue(),(def)->def)
  location: class ProtoGenerator
Your documentation says
One warning: Truth depends on the “Android” version of Guava, a subset of the “JRE” version.
If your project uses the JRE version, be aware that your build system might select the Android version instead.
If so, you may see “missing symbol” errors.
The easiest fix is usually to add a direct dependency on the newest JRE version of Guava.
Does this mean anything other than the maven dep on com.google.guava:guava:28.2-jre? If not, what's the next easiest fix?
The key word here is "newest": You'll need to depend on (as of this writing) 30.1-jre. I have edited the docs to emphasize this.
(You can see the newest version in various locations, including: Maven Central, Maven Central Search, the Guava GitHub page.)
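For example (just a sketch, not from the original report), with Bazel's rules_jvm_external the WORKSPACE entry could pin the JRE flavor directly; the artifact list and repository below are illustrative:

load("@rules_jvm_external//:defs.bzl", "maven_install")

maven_install(
    artifacts = [
        # Pulls in guava 30.0-android transitively.
        "com.google.truth.extensions:truth-proto-extension:1.1",
        # Direct dependency on the newest JRE flavor; it wins version selection
        # because "30.1-jre" sorts above "30.0-android".
        "com.google.guava:guava:30.1-jre",
    ],
    repositories = [
        "https://repo1.maven.org/maven2",
    ],
)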
The problem is:
Some tools (including Gradle as well as the maven_install rule from Bazel's rules_jvm_external) pick the "newest" version of any given artifact among all versions found in your transitive dependencies.
Truth 1.1 depends on version 30.0-android.
30.0-android is considered to be "newer" than 28.2-jre (because 30 is greater than 28).
The -android releases lack the Java 8 APIs.
(So you can actually fix this by depending on any -jre version from 30.0-jre up: 30.0-jre is considered "newer" than 30.0-android because of alphabetical order. Fun!)
Unfortunately, the Maven ecosystem doesn't support a good way to offer 2 "flavors" of every release (JRE+Android). (People often suggest the Maven "classifier," but that does not actually solve the problem.)
For the future:
Gradle: Gradle is working with us to provide its own solution, but it's not quite ready yet.
Maven: Maven is unlikely to provide help. (It doesn't even try to pick the "newest" version, let alone support "flavors.")
Bazel: I don't know if rules_jvm_external (which uses Coursier) has any plans to support "flavors." (Editorializing a bit: In an ideal world, I would rather specify all my repo's transitive dependencies and their versions myself, rather than having the build system try to work it out for me. That can help avoid surprises like this one. But that brings its own challenges, and we've made only incremental effort toward addressing them in our own Bazel-based projects.)

What happened to sqlite MvvmCross GetConnectionWithLock?

I cannot find a patch note anywhere as to why GetConnectionWithLock is removed from MvvmCross.
It is still listed in the docs, but it no longer exists in IMvxSqliteConnectionFactory in version 4.4.
Should I just switch to GetConnection or GetAsyncConnection now, and should I manually use lock(object)?
We switched away from the SQLite.Net-PCL library from Oystein Krogh to Sqlite-net-pcl from Frank Kreuger, because the former does not seem to get any updates and appears to have been abandoned by its author. The switch was mainly prompted by the lack of Android N support.
However, we are removing the SQLite plugin in the next release, since there is not much point in having a light wrapper which only registers the plugin.

How can I upgrade my Realm Swift version from 0.96 to 0.97?

Can I just replace the two old Realm frameworks with the new ones? Or what else should I do?
Yep! If you're not using a dependency manager like CocoaPods or Carthage, you just need to delete the old framework folders and copy the new ones into the same place. Xcode should be fine handling that the next time you attempt to build your project.
If you are using a dependency manager, then you just need to hit the update command in their command line tools, and it'll be taken care of automatically.
Please keep in mind that Realm 0.97 has completely removed all of its previously deprecated APIs, so if you were using any of those, you will get build errors, but they'll be very easy to fix.
I ran into the same question and, while looking around, came up with a good solution. This is in addition to what TiM has pointed out. A few things to keep in mind:
I upgraded from version 1.0 to 1.0.1, so there weren't many changes to the framework and the commands I used in my app.
I didn't use any special or very specific commands, mainly queries and writes/updates of objects, nothing very fancy. If you have very specific requirements of Realm, then I suggest looking into those and seeing if there are any special changes to how they are managed.
Now to the steps:
Remove the frameworks from the "Embedded Binaries" section by clicking the "-":
(screenshot: General tab, Embedded Binaries section)
Remove the frameworks from the project itself by right-clicking on them and selecting "Delete":
(screenshot: Project navigator, framework files)
Now just go and do the steps for installing the frameworks as found in the documentation "realm.io/docs/swift/latest/#installation-swift-22".
I understand this question is rather old, but looking through SO I didn't find a definitive answer to this.
Hope this helped!

Enforce Javadoc version in SBT

I just started to work on a large project that uses SBT for building. I got a new computer with Java 8 installed, but the rest of the team still uses Java 7. That's not a problem as far as the code goes because we're all set to generate v7 byte-code.
The problems arise when attempting to publish the project using the publishLocal action. Please keep in mind, though, that I am very new to SBT and some things I say/assume may not be accurate.
We use sbt 0.13. When I run the command sbt publishLocal, it runs the doc action, which in turn runs javadoc to generate the documentation. Since I have Java 1.8 installed, it uses the corresponding version of javadoc, which, let's be honest, is a real pain in the neck: it complains about every single missing @return or @param, about self-closing elements (e.g. "<p/>") and such, and returns a non-zero value because of this, thus making the publication fail. However, as I mentioned, the project is fairly big and, although it would be better to complete the javadoc documentation, it's not feasible at the moment.
Luckily, javadoc 8 provides an option to disable the pedantry: -Xdoclint:none makes it quiet about pretty much everything, allowing me to run the publish action by adding it to javacOptions.
However, as I said, the other team members are still using java 7, and, unfortunately, javadoc 7 does not support that option, so if I push the build.sbt file with this option it will fail on other machines.
So now I'm wondering what I can do. The way I see it, there are a number of options, none of which seems "simple enough" to fix that stupid problem:
downgrade locally to java 7 (not a big fan of having two concurrent versions lying around)
have all other team members upgrade to java 8 (pain for them)
fix all javadoc problems in the whole project (pain for everyone)
Hopefully there's another option I'm missing that would allow me, for instance, to set the javadoc options based on the Java version? Or anything else that doesn't require touching anything other than the build.sbt file...
Thanks!
David
Yes, you can set the Javadoc options based on the Java version:
javacOptions in Compile ++= {
  sys.props("java.version").split('.') match {
    // Java 7 and below ("1.7...", "1.6..."): javadoc has no -Xdoclint option, so add nothing
    case Array("1", n, _*) if n.toInt <= 7 => Seq()
    // Java 8 and newer: silence the strict doclint checks
    case _ => Seq("-Xdoclint:none")
  }
}

Upgrading Drupal 4.7 to 5.2

While upgrading my website from Drupal 4.7 to 5.2, I am facing an issue:
Fatal error: Call to undefined function node_get_base() in ../question.module.
Can anyone help me solve it?
Thanks in advance
The root of your problem is that you have some code in the file question.module calling a deprecated function (node_get_base()). Drupal 5.2 no longer implements that function, so you get the error.
I assume that the question.module you are using is this one, which does exist for the Drupal 5 series. If I am right, then I suspect you missed following the upgrade instructions for updating your site, which state:
5) Disable all custom and contributed modules.
[..]
11) Ensure that the versions of all custom and contributed modules match the new Drupal version to which you have updated. For a major update, such as from 5.x to 6.x, modules from previous versions will not be compatible and updated versions will be required.
12) Re-enable custom and contributed modules and re-run update.php to update custom and contributed database tables.
These instructions are in the UPGRADE.txt file in the Drupal root folder.
Hope this helps!
When upgrading Drupal to a new release, you more or less have to build the site over again in some respects. Because the Drupal API changes so much between releases, each module will need to be replaced with a new one. In most cases this is just a matter of downloading a new version of the module and running update.php.
You should, however, go through each of your modules and find out what's needed to make the upgrade. Sometimes the upgrade path can be a bit tricky and you need a few attempts to get it right without corrupting any data in the process.
Another thing is that, when upgrading, going for the Drupal 5 version of the same module might not always be the best choice. Especially with Drupal 5, there are a lot of more or less unmaintained modules, so the best choice might be to find a different module that can do what you want, or even fulfill more of your needs.
It's always a hard decision, and your theme will also need to be upgraded to Drupal 5. In short, there is a lot of work involved in a major upgrade. Most of it is making good choices about which modules to use and how to migrate your data.
All that aside, following the upgrade guide in UPGRADE.txt as mac suggests is a very good place to start, and doing all of that legwork would probably have avoided all this.
