Nexus OSS 3.3.0-01 keeps falling over

My team has been using Nexus OSS version 2 (currently on 2.14.2-01) for a very long time without any hiccups, and we like it.
For another project we need Docker support, so we decided to give Nexus OSS version 3.3.0-01 a go. Unfortunately, it falls over a couple of times per day and I don't know how to debug this. Are other people having the same problem with this version? Any suggestions?

My team has the same issue with our Nexus 3. It just falls over at random times. Yesterday we noticed that RAM usage on the server was at 95%, so we increased the RAM (it's a virtual machine). Since then Nexus hasn't fallen over, but I can't guarantee that the problem won't come back.
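If RAM pressure turns out to be the culprit, it is also worth checking the heap limits Nexus itself starts with. A minimal sketch, assuming a default Nexus 3 layout where bin/nexus.vmoptions holds the JVM flags (the values below are illustrative, not a recommendation):

-Xms2703m
-Xmx2703m
-XX:MaxDirectMemorySize=2703m

Restart the Nexus service after editing the file so the new limits take effect.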

Related

Is there an Artifactory 7 release containing a fix for RTFACT-26825 (memory leak and timeouts using NuGet repositories)?

In our company we recently upgraded from version 6 to artifactory-pro:7.38.10. To clean up old artifacts we use Lavatory, which runs an AQL search to identify the artifacts to be removed by filtering them by date. This worked without issues in our previous installation based on Artifactory 6. After the upgrade, Artifactory frequently crashes with an OutOfMemoryError, and the instance seems to either require significantly more memory than before or have a memory leak. After further investigation it turned out that the problem is caused by running the AQL search: memory usage jumps from 4 GB to over 10 GB. That's +6 GB for something that hasn't changed.
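The search Lavatory runs is essentially of this shape (the repository name and retention window here are illustrative, not our exact query):

items.find({"repo": "generic-local", "created": {"$before": "90d"}})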
While searching for known issues I found https://www.jfrog.com/jira/browse/RTFACT-26825, which is marked as resolved and might explain our problem, but no fix version is specified. Since there is a workaround and the issue was fixed, I expect that there must be a release containing the fix.
Is there already a release containing a fix?
The JIRA issue you are referring to was fixed in Artifactory 7.38.0, so it most likely is not the cause, as you are on a version higher than 7.38.0.
In order to confirm, you may try the following. Add this system property to the $JFROG_HOME/artifactory/var/etc/artifactory/artifactory.system.properties file and restart Artifactory for the change to take effect:
artifactory.nuget.v2.search.page.size=1000
Alternatively, you may take all the NuGet DevExpress repositories offline, then check whether you still encounter the memory issue. If you do not, there may be a regression issue. But my assumption is that your server simply needs more resources, as Artifactory 7 introduced a lot of microservices compared to Artifactory 6.
Please check whether you satisfy the resource requirements mentioned in this page; if not, you will need to tune your Artifactory as described in this article.
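On the tuning side, the first knob in Artifactory 7 is usually the JVM heap, which now lives in the system.yaml rather than in artifactory.system.properties. A minimal sketch, assuming the default layout (the heap sizes are illustrative):

# $JFROG_HOME/artifactory/var/etc/system.yaml
shared:
  extraJavaOpts: "-Xms4g -Xmx12g"

Restart Artifactory after changing it.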

How did my Artifactory generic and docker repos suddenly change type/version?

We have been running Artifactory (currently version 6.9.0) in EC2 for months now with no problems. This was originally a licensed instance of Enterprise Artifactory that we let lapse (intentionally).
Last week we started getting a storage warning (we use cluster-s3 storage) that we were at 95% utilization (which disables uploads), so we started cleaning up old artifacts (e.g., binaries, Docker images) to get the storage down. We got it down for a while, but it crept back up -- high enough this time that we couldn't ssh in, so we rebooted the machine via the EC2 Console.
It came right back with no obvious problems. Then we deleted a generic repository that someone had set up as a backup of another system (300 GB), which freed up plenty of space.
Today, a number of our builds started failing because the step to push the artifact to Artifactory failed. Upon further investigation, a number of our "generic" repositories are now appearing (and behaving) as "Docker" repositories. Further, a number of our v1 Docker repositories are now reporting as v2 Docker repos and blocking standard pushes from v1 clients.
The docs are pretty clear that we can't change the repo type, and I'm not seeing a way to migrate back to v1 from v2 Docker repos. I'm currently exporting one of the repos to see if we can import it as the right type.
Any idea what happened here? Did something get corrupted in the database? What can I even start to check?

Memory problem when using Xcode 4

I updated to Xcode 4 a few days ago; Xcode 4 is really nicer than Xcode 3. But I hit a memory issue when using it: total active memory keeps growing while Xcode 4 is running, from 500 MB to 2.4 GB, while the process's own memory stays around 200 MB. It's strange~
After I closed Xcode, total active memory didn't go down right away; it stayed at 2.4 GB for about 10 minutes.
Has anyone else met this issue too? Thanks for any info!
== Updates ==
Upgraded to Xcode 4.0.2; it still has the memory issue.
I have the same problem. At times Xcode 4 starts to index your project (you can see the "Indexing" message in the status bar at the top of the window). While indexing it can use up to 2.8 GB (!) of memory.
As soon as that happens I stop using my laptop and go make tea :)
If swap exceeds 500 MB I restart my computer. I have 4 GB of memory installed in my MacBook 5,2 and there is no way to increase it :(
I don't know what that "indexing" actually means. I suppose it is connected with Code Sense in some way, but when I tried disabling code completion (Preferences -> Text Editing -> Editing), it didn't help.
I hope Apple will fix it in the next release. If not, the only options are to upgrade my computer or to use Xcode 3.2.
I'm having this same issue. Currently I'm using the following workaround:
I keep Activity Monitor open on a second screen, and whenever Xcode reaches 1 GB I restart it, and it works smoothly once again.
I know it's far from a perfect workaround, and I'm looking forward to a better one.
I have Xcode 4.0.1 and OS X 10.6.7.
I found a solution!!!
I wanted to clean my /Library/Caches. Accidentally I deleted part of my /Library :-) so I decided to do a full system restore using the OS X DVD and my current (20-minute-old) Time Machine backup. I did the restore and ... it fixed the problem! A Time Machine restore clears all caches! (It should be enough to delete only the contents of /Library/Caches and {HomeDirectory}/Library/Caches.) Good luck!
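If you'd rather not do the full restore, clearing just the caches from Terminal should amount to something like this (a sketch; quit Xcode first):

# clear user-level and system-level caches
rm -rf ~/Library/Caches/*
sudo rm -rf /Library/Caches/*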

CSS cross browser compatibility on Ubuntu

I'm currently working in web development, my default desktop is Ubuntu, and I'm pretty happy with the setup and applications I have going. But I need to test web pages for cross-browser compatibility while staying on Ubuntu.
I have gone through hell trying to get IE7 or IE8 to run on Ubuntu (with Wine), and when they finally worked they were very buggy and the graphics/scrolling were insanely slow.
Of course there is the option of VirtualBox, but again, that's too many gigabytes just to run a small application!
So, to all the CSS gurus out there: how can I continue with my beloved Ubuntu and still deliver a good-quality (tested) page?
Thank you.
Edit:
Update for freshness:
I now use the paid service from browserstack.com to provide the multitude of different browser testing environments via Flash tunnelling. I'm a paid user, but there is an initial free trial period. BrowserStack has freed me from the need to run the Windows OS on my machines in any form, virtual image or otherwise. Since it also allows tunnelling, I can host the site on my local machine but still test it in BrowserStack's browsers. I consider the monthly fee money very well spent.
End Edit
Various options I have tried, including "the final solution": free downloadable Windows testing OSes from Microsoft.
I've tried a number of the options below, but VirtualBox may be your best bet for full and complete testing, especially because in a professional capacity you often have to test IE8, IE7 -and- IE6, which gets tricky with only a single OS installed. So, in order from the simplest and shallowest testing to the most complex and most complete testing:
browserlab.adobe.com
A newer, interesting online solution is browserlab.adobe.com. It's quite precise and fast compared to Browsershots. It only gives you screenshots, but it's a great first step, so I do recommend it for purely visual (and thus relatively shallow) testing.
Browsershots.org
While browsershots.org is also something you should use for an overview of what users might see, you really can't get by without real browsers for JavaScript and behavior testing (instead of just the display and rendering testing that Browsershots provides). The delay before you can see the images is also a killer.
Dual booting into Windows
Another option I've tried is dual booting: I work 99% of my time in Ubuntu, and I have Windows installed and available to dual boot into. Not a fast way to test, but if you have no other way to access IE, it should work for at least the latest version.
Remote desktop-ing over to a running Windows box
Before I mention the "covers-all-the-bases" option, another useful possibility is to set up a Windows machine, boot it up, and connect to it via remote desktop, so that you can work on one machine and test on both.
The final solution: using VirtualBox
Finally, the mother of all solutions, using VirtualBox.
Luckily (I know you said you didn't like the VirtualBox solution, and I know it's an annoying setup process, but...) Microsoft provides virtual-machine images with different versions of IE pre-installed, available without the need for a license for a year or so before you have to update the virtual machine:
http://www.microsoft.com/downloads/details.aspx?FamilyId=21EABB90-958F-4B64-B5F1-73D0A413C8EF&displaylang=en
Installing a virtual machine from Microsoft's freely available browser-testing images
Because this guide to setting this up on Ubuntu is no longer available in full anywhere else, and in case you or someone else actually needs it, I feel compelled to include the actual details of the install process, which were suggested to me on the Ubuntu forums and worked when I went through them. I apologize for their length. Courtesy of the now-anonymous original poster on the Ubuntu forums:
Free Access to Microsoft Browser Compatibility Virtual OSes, Install Steps for Ubuntu
http://ubuntuforums.org/showthread.php?t=1097080 (Ed: I can't find this thread online any more)
HOWTO: run IE6, IE7, IE8 on Linux in VirtualBox
You need: virtualbox, qemu, wine
Code: apt-get install virtualbox qemu wine
Download the free(!) Microsoft Internet Explorer Application Compatibility Check VPC images here:
http://www.microsoft.com/downloads/details.aspx?FamilyId=21EABB90-958F-4B64-B5F1-73D0A413C8EF&displaylang=en
(Note: you don't have to download the full pack; you can cherry-pick specific combinations of XP/Vista and IE6-8.)
Extract the VPC image(s) with wine (double-click). (Note: it might take a while before the first window shows up.)
Turn the VPC image(s) into VMware image(s), which are readable by VirtualBox:
Code: qemu-img convert -f vpc image.vhd -O vmdk image.vmdk
Set up a new VM in VirtualBox, using the vmdk image as an existing disk. Boot it; you will see the Windows boot progress bar and ... it will BSOD shortly after.
Fixing the BSOD:
The BSOD is caused because the virtual Windows tries to load processor drivers for the wrong processor (it is not running on a VirtualPC processor, but on a VirtualBox one). Or something like that... We need to force Windows not to attempt to load drivers for the processor (it doesn't need any processor drivers, because it's all virtual anyway). Start safe mode by (frantically) hitting F8 at Windows boot and choosing safe mode.
Ignore all the 'New hardware detected' warnings (we will deal with those later). Start a command box and run the following command to disable the loading of processor drivers:
Code: sc config processor start= disabled
(Note the space between '=' and 'disabled'!)
Restart the virtual Windows; it should now boot all the way to the Windows desktop.
Now, just when you think you can start browsing the web with IE, you will find out that the virtual Windows needs to install the drivers for the AMD PCnet NIC, which are located on the Windows install disk. Fortunately for those without a Windows install disk, there is another way :)
Download the AMD PCnet drivers here:
http://www.amd.com/us-en/ConnectivitySolutions/ProductInformation/0,,50_2330_6629_2452%5E2454%5E2486,00.html
Make an iso file containing the drivers. I used Brasero for simplicity: choose to create a Data Project, add the zip file (or the unzipped files, which saves you a step in Windows), and create the iso. No need to burn an actual CD!
Stop the virtual Windows and edit the settings in VirtualBox: mount your brand new iso.
Start the virtual Windows; when it asks to install the drivers for the PCnet NIC, point it to the (unzipped) drivers. Et voila! You have teh innernets! (Now you can also try to install the other drivers it complains about, but it's not really necessary.)
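An aside on the ISO step above: if you'd rather skip Brasero, the same ISO can be built from the command line. A minimal sketch, assuming the genisoimage package and a folder of unzipped drivers (names are illustrative):

# build a Joliet + Rock Ridge ISO from the driver folder; no burning needed
genisoimage -o pcnet-drivers.iso -J -r ./pcnet-drivers

Then mount pcnet-drivers.iso in the VM settings as described above.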
The image README says the image will expire after about a year. In my experience the system gets hobbled against multi-hour use but is still usable for the kind of short periods you need when booting up to test a website. At worst you might have to go through these steps again, so be sure to keep them somewhere you can find them again in a year or so.
I think setting up a virtual machine (VirtualBox or VMware or ...) with a proper Windows will be your only (local) option.
If you don't have one, buy a used Windows XP license. XP is cheap (around 20-30 euros here in Germany, for example) and all relevant versions of IE run on it. The Home edition is enough; no need for Windows 7 or anything.
You could install IETester on that to get all the IE versions on one OS. IETester has flaws and is not always 100% reliable in what it renders, but for a general CSS compatibility check it should be okay.
I've never tried IE using Wine, but even trying to imagine the combination gives me goose bumps :D
If you have a copy of Windows you could install it in a virtual machine (VirtualBox is a good, free option). Or, if you don't mind a lot of lag time and publicly exposing your web pages, you could use a service like Browsershots.
I have not tried this on anything but Windows, but it seems to be a pretty good web-based testing system:
http://spoon.net/browsers/
However, I think your best result would be to use a VM if possible.
I have to add my voice to those opting for VirtualBox.
VMs are the only way to get an accurate representation of how the IE platforms behave. They also allow you to keep your main Linux install free of Wine and IE gunk, which is otherwise always troublesome and fragile (especially if you're trying to run multiple IEs, which is unreliable and inaccurate even under Windows).
They're not necessarily that big if you take care to prune unneeded features, turn off swap, compact the disk image, and so on. My XP SP3 test image is just over 800 MB.
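For the compacting step, VirtualBox can shrink a dynamically allocated disk image from the host side. A sketch, assuming you have zeroed out the free space inside the guest first (the filename is illustrative):

# compact the dynamically allocated image in place
VBoxManage modifyhd WinXP-test.vdi --compact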
I didn't want to install all this stuff, as I wanted to move forward quickly.
I found public AWS images with pre-installed browsers that you can just start and use.
http://www.hens-teeth.net/html/products/cross_browser_testing.php
If you already have an AWS account this will take you only five minutes. Make sure you allow the RDP port for incoming traffic in your security group.
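If you script your AWS setup, opening the RDP port (3389) looks roughly like this with the aws command-line tools (the group name and CIDR are placeholders):

# allow inbound RDP, ideally only from your own network
aws ec2 authorize-security-group-ingress --group-name browser-testing --protocol tcp --port 3389 --cidr 203.0.113.0/24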
As I use Ubuntu, I was looking for a way to connect from it to Windows.
I connect to the instances via remote desktop.
The way to go here is rdesktop, a command-line client for Windows Remote Desktop (sudo apt-get install rdesktop).
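A typical invocation looks like this (the hostname and user are placeholders):

# open a 1280x800 remote desktop session on the Windows instance
rdesktop -u Administrator -g 1280x800 your-windows-host.example.com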
If you prefer a GUI, use tsclient; it's very close to the Windows version.
From a workflow perspective, I develop for Chrome on Ubuntu first, then have a look at the other browsers via browserlab.adobe.com.
After that I start my AWS instance to debug.
A small AWS Windows instance costs $0.12 per hour (http://aws.amazon.com/ec2/#pricing). I can work for a long time on that before it's worth installing all this stuff locally.
CrossBrowserTesting.com works from Linux. It allows you to access Mac, Windows, and Ubuntu configurations, and all the browsers loaded on them, via the Vinagre VNC client.

Running Visual Studio in Parallels for Mac - problem with debugging sites sitting on the OS X drive

I've installed Parallels Desktop on my MacBook to be able to run Visual Studio 2008 in an XP installation. Everything works great, except that I decided to put my websites in my Sites folder on the OS X file system (which happens by default because the My Documents folder is mapped to the Mac's Documents folder, and I'd rather put my code there so that both OSes can easily access it).
When trying to build or debug I get this error:
Failed to start monitoring changes to 'Z:\xxx...'
How do I get this to work under Parallels, from the shared drive?
Parallels uses network drives to simulate folders on OS X, and Windows can't monitor changes to network drives, so if you do this directly, it'll be broken.
If you want to keep them in sync, though, use Live Mesh (http://www.mesh.com) and install it on both the host and the guest. A little roundabout, but it'll ensure both copies are maintained (and Live Mesh is handy for other things too).
I recently flipped over to putting my source code on my Mac volume so I could use Time Machine to back it up, and immediately got this same problem with my ASP.NET app. Other, procedural applications built just fine, by the way.
I tried all sorts of things, including using Samba on the Mac side to share the directory, which led to the "too many BIOS commands" error described elsewhere. Unfortunately for me, the Registry hacks to fix that problem never worked, for some reason.
I finally found another solution that avoids Samba and just uses the regular Parallels Shared Folders. It too is a Registry hack, but this one simply turns off file change monitoring for ASP.NET. It is a bit heavy-handed, but it gets my builds working again.
The reference for this change is here:
http://support.microsoft.com/kb/911272
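If I recall the KB correctly, the change boils down to a single registry value that disables ASP.NET's file change notifications. A sketch as a .reg file (verify the value name and data against the article before applying it):

Windows Registry Editor Version 5.00

; disable ASP.NET file change notifications machine-wide (per KB911272, as I recall)
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ASP.NET]
"FCNMode"=dword:00000001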
The downside to this approach, I am finding, is that you need to be more deliberate about recompiling or restarting the web server, as changes during development no longer just magically appear. I am still deciding whether that is a worthwhile tradeoff.
UPDATE: After several days of this, development was just too difficult and, sadly, what I reverted to was keeping my source inside the Parallels virtual disk. To enable Time Machine backups and Spotlight searches, I used a lightweight MS utility called SyncToy to push stuff out of Parallels and out to my Mac drive several times a day. Despite the high hack factor, it is working well.
I know this isn't strictly a solution, but VMware Fusion is superior when it comes to sharing drive space with a virtual machine. It's what I currently use, and it hasn't let me down thus far...
People always give me odd looks when they see Visual Studio on my Mac :P
Try moving the project onto the VM's C: drive. It's not an ideal situation, but you can access the VM's C: drive from OS X.
I have a similar problem with a PHP site that uses an MS Access database (it's a client's system). I have aliases that point to the PHP site on the VM so that I can still do all of my coding in OS X. To do this, I created a network share on the VM and then connected to it from OS X; once connected, make the aliases. If the network drive is not mounted and you open a file in OS X, it will try to reconnect. It means the VM needs to be running to get to the files, but this isn't normally a problem since the VM is hosting the site anyway.
.NET has funny issues trying to debug objects on a network drive.
Make sure that you have full trust configured on your local network between your Mac and the XP install.
Check out: http://msdn.microsoft.com/en-us/library/aa302361.aspx
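In the .NET 1.x/2.0 era, the usual way to grant full trust to a mapped drive was caspol, run inside the VM; something like this (the drive letter and group name are illustrative):

REM grant FullTrust to everything on the Parallels share mapped as Z:
caspol -machine -addgroup 1.2 -url "file://Z:/*" FullTrust -name "ParallelsShare"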
If that research doesn't pan out, I'm afraid you will have to look into the option of keeping the code on the VM disk and moving it when you need it.
I see a similar problem on my machine connected to the Windows domain: My Documents is mapped to a network share and I can't debug, run, etc. I eventually had to move to my local disk for debugging.
I definitely recommend Live Mesh as a way to keep directories in sync. Just keep the VM's directory in sync with the Mac's directory.
Or use SVN to hold copies on both machines and commit/update as appropriate. That way you get versioning and history, and if your project grows bigger, you can share it with other devs; a bare-bones sketch follows below.
I know Dropbox also has history and sharing, but not check-in/check-out/conflicts and all the other advantages of real source control.
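That SVN loop, with placeholder repository URL and paths:

# on the Mac: get a working copy, edit, commit
svn checkout https://svn.example.com/repo/mysite ~/Sites/mysite
svn commit -m "edits from OS X" ~/Sites/mysite
# in the Windows VM: pull the changes into its own working copy
svn update C:\work\mysite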
Oh, if you have money you can also go for TFS. I would, but it is just too expensive :)
