I have a bare Git repository. Let's say I have some content in branch master and the head is pointing at commit "C1".
test.txt at C1 contains
line-1
Now I create a new branch ref R1 and add a new commit C11 whose parent is C1; C11 is now the head of R1.
test.txt at C11 contains
line-1
line-2
line-3
I pull master and see that a new commit C2 has been added.
It contains test.txt as
line-1
line-2
Now I want to merge C2 with C11 and create a new commit.
The issue I am facing is during the merge: I think this should auto-merge without any conflicts.
To merge:
Merger merger = MergeStrategy.RECURSIVE.newMerger(git.getRepository(), true);
merger.merge(c2, c11); // c2 and c11 are the ObjectIds of commits C2 and C11
Here, the merge result is conflicted. But ideally, it should have been a non-conflicting auto-merge.
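Roughly, the surrounding code looks like this (the ref names, the imports, and the commit-building step are illustrative rather than my exact code):

import org.eclipse.jgit.lib.ObjectId;
import org.eclipse.jgit.lib.Repository;
import org.eclipse.jgit.merge.MergeStrategy;
import org.eclipse.jgit.merge.Merger;
import org.eclipse.jgit.merge.ResolveMerger;

Repository repo = git.getRepository();
ObjectId c2  = repo.resolve("refs/heads/master"); // commit C2
ObjectId c11 = repo.resolve("refs/heads/R1");     // commit C11

// inCore = true, so the merge runs entirely in memory (there is no work tree in a bare repo)
Merger merger = MergeStrategy.RECURSIVE.newMerger(repo, true);
boolean clean = merger.merge(c2, c11);
if (clean) {
    ObjectId mergedTree = merger.getResultTreeId();
    // ...build and insert a commit object pointing at mergedTree, with parents C2 and C11...
} else if (merger instanceof ResolveMerger) {
    System.out.println("Conflicting paths: " + ((ResolveMerger) merger).getUnmergedPaths());
}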
The JGit merge API org.eclipse.jgit.merge.MergeAlgorithm#merge receives the following inputs:
base:
  line-1
ours:
  line-1
  line-2
theirs:
  line-1
  line-2
  line-3
What could be missing here? Does anything special need to be done when merging in a bare repository?
Thanks.
I was trying to store multiple diff outputs in one .patch file to keep versioning in one file instead of running multiple
diff -u f1 f2 > f1.patch
commands. Preferably I'd keep running
diff -u[other params?] f1 f2 >> f1.patch
to have one file containing all changes, which would allow me to later run patch on that file to get the f1 file as it was at any given moment.
Unfortunately, patch fails with a file generated in this manner. It only seems to apply the first patch from the file and then quits with an error.
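Concretely, the workflow I'm attempting looks roughly like this (the file names are just an example):

diff -u f1 f2 > f1.patch      # first change
diff -u f1 f2 >> f1.patch     # later changes appended to the same patch file
patch f1 < f1.patch           # applies only the first patch, then errors out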
My question: is that possible with diff and patch? And if so, how?
Thank you in advance.
I want to remove the WordPress core files from a lot of Git repositories but keep the files on disk so it doesn't break things locally.
git rm --cached -r wp-includes/ wp-admin/ ......
That solves the problem; the question is how, on the production server, to git pull the commit WITHOUT physically deleting the files, so that production doesn't break.
What is the solution to this? (Commit both on production and locally, and since the commits will be the same, nothing will be physically erased?)
First, addressing your proposed solution: you will cause yourself unneeded trouble if you commit the same change in both places. The commits will not be the same from git's point of view (different SHA1 values) if anything is different; and even if you try to do everything perfectly, the timestamps will differ. I've messed myself up by forgetting the timestamps before, so trust me on this one. Once you have two different commits with the same changes, getting everything back in sync will probably be tedious, and there's a fair chance you'll end up accidentally deleting the files from production anyway, just to add insult to injury.
So instead:
I'm assuming that in the same commit where you delete the files you add them to .gitignore. (If not... well, it would be a good idea.) Even if you didn't, what I'm about to suggest will still work easily enough if there are no other deletes in the commit. (Bear with me, I think it will all make sense in a minute.)
So as you note, if you do a pull then the commit (including the deletes) will be applied; and that's bad. But you can use reset to "step over" those changes. I'll give the most general procedure. Suppose you have on production:
X --- X <--(master)
but you're behind origin; now instead of pull do
git fetch
giving you
X --- X <--(master)
       \
        A --- B --- C <--(origin/master)
              ^
              deletes are here
If there is no A (i.e. the first newly-fetched commit has the deletes), then you can skip this step; but otherwise:
git reset --hard A
and now we have
X --- X --- A <--(master)
             \
              B --- C <--(origin/master)
              ^
              deletes are here
Next we need to let git think that B is applied
git reset --mixed B
which leaves our work tree untouched. That means that git status would now show the inverse of each change from B as an unstaged change (unless (a) that change is a delete, and (b) a corresponding entry was put in .gitignore); more on that in a second, but our picture now looks like
X --- X --- A --- B <--(master)
                   \
                    C <--(origin/master)
If B had changes other than deleting the files, you need to remove the "undoing" of those changes from your work tree.
git checkout -- .
should apply anything except file deletes (which is good, because you don't want to apply the deletes - that being the point of this exercise). If there are other deletes, you'll need to reapply those manually. The git status output will lead the way (especially if you updated .gitignore, since in that case you'll know you're done when git status reports a clean work tree).
From here we just go back to the normal approach, so if there is a commit C (i.e. there have been commits after the deletes):
git pull
and finally we have
X --- X --- A --- B --- C <--(master)(origin/master)
without the worktree copies of the deleted files having been affected.
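Putting it all together, the sequence on production looks roughly like this (A and B stand for the actual commit SHAs, which you can read off of git log origin/master after fetching):

git fetch            # update origin/master without touching the work tree
git reset --hard A   # advance master (and the work tree) to the last commit before the deletes; skip if there is none
git reset --mixed B  # move master past the delete commit; the index now matches B, the work tree is untouched
git checkout -- .    # re-apply B's non-delete changes to the work tree
git pull             # catch up with C and anything after it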
Hello everyone, I am a newcomer.
I am learning OpenStack and KVM, but I have run into a difficult problem:
I have a qcow2 image A,
a qcow2 delta image B whose backing file is A,
and a qcow2 image C whose backing file is B.
Now I want to merge B and C into a new qcow2 image D whose backing file is A.
I have tried to use qemu-img to solve it, but haven't found a working solution yet.
I hope you can help me; I'd really appreciate it.
With the VM in question currently running, use virsh blockpull:
virsh blockpull --domain vmname --path /var/lib/libvirt/images/c.qcow2
This assumes vmname is the VM using c.qcow2, which is backed by b.qcow2, which is backed by a.qcow2.
If you'd like a file other than c.qcow2 to be the final, complete, backing-file-free image, then create a d.qcow2 for the VM first and name it in the virsh command. This will leave a, b, and c intact and pull a+b+c into d.
And yes, the domain is to be up and running while you do it.
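A rough sketch of that variant (the disk target vda and the image paths are assumptions; adjust them to your setup):

# switch the running VM to a new, empty overlay d.qcow2 on top of c.qcow2
virsh snapshot-create-as --domain vmname --name flatten --disk-only \
      --diskspec vda,file=/var/lib/libvirt/images/d.qcow2
# pull a+b+c into d.qcow2, leaving the original images untouched
virsh blockpull --domain vmname --path /var/lib/libvirt/images/d.qcow2 --wait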
cp C D
qemu-img rebase -b A D
This creates a copy of C called D and then rebases D on A.
Yes, it does work! Read here
Situation:
A <- B <- C
Task:
A <- D(B2&C2)
Solution:
A <- D <- B <- C
What you need to do is:
1. Create a new - and therefore empty - snapshot D on top of A.
2. Make copies of B and C as B2 and C2.
3. Rebase B2 onto D.
4. Rebase C2 onto B2.
5. Commit C2 into B2 and delete C2.
6. Commit B2 into D and delete B2.
What remains is D, containing the changes from B and C, with A as its base. Done. This works because inserting an EMPTY snapshot into an existing chain of snapshots is allowed.
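For concreteness, here is a rough qemu-img command sequence for those steps (the file names are illustrative, and the -F backing-format flag may need to be dropped on older qemu-img versions):

# 1. create a new, empty snapshot D backed by A
qemu-img create -f qcow2 -b A.qcow2 -F qcow2 D.qcow2
# 2. work on copies so the original chain stays intact
cp B.qcow2 B2.qcow2
cp C.qcow2 C2.qcow2
# 3./4. re-point the copies: B2 on top of D, C2 on top of B2
#       (-u only rewrites the backing-file reference; the data is unchanged because D is empty)
qemu-img rebase -u -b D.qcow2 -F qcow2 B2.qcow2
qemu-img rebase -u -b B2.qcow2 -F qcow2 C2.qcow2
# 5. fold C2 into its backing file B2, then discard C2
qemu-img commit C2.qcow2
rm C2.qcow2
# 6. fold B2 into its backing file D, then discard B2
qemu-img commit B2.qcow2
rm B2.qcow2
# D now holds the changes from B and C and is backed by A
qemu-img info --backing-chain D.qcow2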
I've merged my code with a branch at a revision (say X). Since then, there have been more commits on the other branch, and I want to pull those changes into the merged code (which is not committed yet). Can I run the svn merge command (up to, say, revision Y) on already-modified files?
Assuming my repo has source files (say tmp1.c, tmp2.c)
Code:
$> svn merge -r branch_out_rev:orig_merge_rev url/to/the/branch/that/is/being/merged/
U tmp1.c
A tmp3.c
$>svn st
M tmp1.c
A + tmp3.c
Now the question is, can I again do
$> svn merge -r branch_out_rev:new_merge_rev url/to/the/branch/that/is/being/merged/
I'm having a problem when trying to extract information from Excel files. Here's my situation: I have 34 Excel files which I received from various users.
I'm using PHP 5 to extract data from the Excel files. My script loops over every file, then loops over the sheet names, and lastly loops over the cell addresses.
The problem arises when a user has entered a formula such as =+A1 into a cell, i.e. the user references another cell because it holds the same value as cell A1.
When I checked MySQL (as I saved the values for future use), I found that the record for a particular cell was identical to another record obtained from the same cell but in a different Excel file. What I mean is: as my PHP script loops from one file to the next, the first time PHPExcel reads, for example, cell C3 it contains some value such as USD 3,000.00; in the next file PHPExcel may read the same cell C3, but this time C3 contains a formula referencing cell A1 (the "=+A1" formula), which has the value USD 5,000.00.
The PHP script is supposed to record USD 5,000.00 in MySQL, but it didn't. I suspect that PHPExcel did not clear the variable from the first round. I've tried unset($objPHPExcel) to destroy the variable, but it's still happening.
My code is simple, as follows:
if (file_exists($inputFileName))
{
    $inputFileType = PHPExcel_IOFactory::identify($inputFileName);
    $objReader = PHPExcel_IOFactory::createReader($inputFileType);
    $objReader->setReadDataOnly(true);
    $objPHPExcel = $objReader->load($inputFileName);

    // to obtain date from FILE and store in DB for future comparison
    $validating_date_reporting = $objPHPExcel->getSheet(0)->getCell('C10')->getValue();
    $validating_date_reporting = PHPExcel_Style_NumberFormat::toFormattedString($validating_date_reporting, "YYYY-MMM-DD");
    $validating_date_reporting = date('Y-m-d', strtotime($validating_date_reporting));

    // first entry
    $entry = mysql_query('INSERT INTO `'.$table.'`(`broker_code`, `date`, `date_from_submission`) VALUES("'.$broker_code.'","'.$reporting_date.'","'.$reporting_date.'")') or die(mysql_error());

    foreach ($cells_array as $caRef => $sName)
    {
        foreach ($sName as $sNameRef => $cells)
        {
            $wksht_page = array_search($caRef, $sheetNameArray);
            $cell_column = $wksht_page.'_'.$cells;
            echo $inputFileName.' '.$caRef.' '.$cell_column.'<br>';
            $value = $objPHPExcel->setActiveSheetIndexByName($caRef)->getCell($cells)->getCalculatedValue();
            echo $value.'<br>';
            if ($value)
            {
                $isdPortal->LoginDB($db_periodic_submission);
                $record = mysql_query('UPDATE `'.$table.'` SET `'.$cell_column.'` = "'.$value.'" WHERE broker_code = "'.$broker_code.'" AND date_from_submission = "'.$validating_date_reporting.'"') or die(mysql_error());
            }
        }
    }
}
I really hope that you can help me out here.
Thank you in advance.
PHPExcel holds a calculation cache as well, and this is not cleared when you unset a workbook: it has to be cleared manually using:
PHPExcel_Calculation::flushInstance();
or
PHPExcel_Calculation::getInstance()->clearCalculationCache();
You can also disable calculation caching completely (although this may slow things down if you have a lot of formulae that reference cells containing other formulae) using:
PHPExcel_Calculation::getInstance()->setCalculationCacheEnabled(FALSE);
before you start processing your files.
This is because currently PHPExcel uses a singleton for the calculation engine. It is in the roadmap to switch to using a multiton pattern later this year, which will effectively maintain a separate cache for each workbook, alleviating this problem.
EDIT
Note that simply unsetting $objPHPExcel does not work. You need to detach the worksheets before unsetting $objPHPExcel.
$objPHPExcel->disconnectWorksheets();
unset($objPHPExcel);
as described in section 4.3 of the Developer Documentation. This is also the point where you should add the PHPExcel_Calculation::flushInstance(); call.
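Putting the pieces together, each iteration of your per-file loop might end something like this (a minimal sketch; $inputFileNames and the cell-reading step stand in for your own code):

foreach ($inputFileNames as $inputFileName) {
    $inputFileType = PHPExcel_IOFactory::identify($inputFileName);
    $objReader = PHPExcel_IOFactory::createReader($inputFileType);
    $objReader->setReadDataOnly(true);
    $objPHPExcel = $objReader->load($inputFileName);

    // ... read cells with getCalculatedValue() and store them ...

    // Detach the worksheets before unsetting the workbook, then flush the
    // (singleton) calculation cache so values from this file cannot leak
    // into the next one.
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel);
    PHPExcel_Calculation::flushInstance();
}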