I am logged in as root and using the GitLab API with Python. On the server, I have a git repository containing my template (example) code. This template code is, however, not under the Administrator account but under my own namespace (teacher/template). I want to distribute this template project to 25 students as an assignment, i.e. fork this template into 25 additional projects in a different namespace/group, such as StudentsGroup/assignment1, StudentsGroup/assignment2, and so forth.
Can anyone tell me the best way to achieve this?
In the GitLab API, I have seen two possibilities:
First option:
Admin fork relation
Allows modification of the forked relationship between existing projects. Available only for admins.
Create a forked from/to relation between existing projects.
POST /projects/:id/fork/:forked_from_id
Parameters:
id (required) - The ID of the project
forked_from_id: (required) - The ID of the project that was forked from
Second option:
Fork project
Forks a project into the user namespace of the authenticated user.
POST /projects/fork/:id
Parameters:
id (required) - The ID of the project to be forked
If you have GitLab admin rights, the first option is at the moment the only way to achieve this.
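A minimal sketch of that call in Python with the requests library, against the v3 API path quoted above; the server URL, token, and project IDs are placeholders, and the student projects must already exist:
import requests

GITLAB_API = "https://gitlab.example.com/api/v3"   # placeholder server URL
HEADERS = {"PRIVATE-TOKEN": "ADMIN_TOKEN"}         # placeholder admin token

template_id = 42       # hypothetical ID of teacher/template
assignment_id = 101    # hypothetical ID of StudentsGroup/assignment1

# Mark the existing student project as a fork of the template (admin only).
resp = requests.post(
    f"{GITLAB_API}/projects/{assignment_id}/fork/{template_id}",
    headers=HEADERS,
)
resp.raise_for_status()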
But there is an issue (https://gitlab.com/gitlab-org/gitlab-ce/issues/21591) and a merge request (https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/6213) to add a namespace_id parameter to the second option. If we are lucky, the next GitLab version (8.12) will add this feature (GitLab normally releases on the 22nd of each month).
As of today, 06 Sep 2015, no solution is available, so I found a workaround:
Clone Teacher/template.git to the local disk.
Begin a for loop:
Create an empty Students/jobs_{count}.git using the normal create-project API.
In the local clone, keep changing the remote origin in a script, e.g. git remote set-url origin git@server:Students/jobs_{count}.git
Push the files: git push origin master
End of for loop
If anyone is interested in the script, you can ping me, or I will upload it later after removing the credentials.
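In the meantime, here is a rough sketch of that loop in Python (not the original script), assuming the requests library for the API calls; the server URL, token, Students namespace ID, and clone path are placeholders:
import subprocess
import requests

GITLAB_API = "https://gitlab.example.com/api/v3"   # placeholder server URL
HEADERS = {"PRIVATE-TOKEN": "TEACHER_TOKEN"}       # placeholder token
STUDENTS_NAMESPACE_ID = 7                          # hypothetical ID of the Students group

# Assumes Teacher/template.git has already been cloned into ./template
for count in range(1, 26):
    # 1. Create the empty project Students/jobs_<count> via the create-project API.
    resp = requests.post(
        f"{GITLAB_API}/projects",
        headers=HEADERS,
        data={"name": f"jobs_{count}", "namespace_id": STUDENTS_NAMESPACE_ID},
    )
    resp.raise_for_status()

    # 2. Point the local clone's origin at the new project.
    subprocess.check_call(
        ["git", "remote", "set-url", "origin",
         f"git@server:Students/jobs_{count}.git"],
        cwd="template",
    )

    # 3. Push the template code.
    subprocess.check_call(["git", "push", "origin", "master"], cwd="template")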
Related
Most of our client projects have a very similar starting point:
Pristine AWS account used only for a single environment for this application
GraphQL API and a basic model to start with
A REST API with an OAuth handler endpoint and a generic webhook listener endpoint, each with a corresponding Lambda function (with code for each)
I've created this basic amplify app and I want to create a repo with this general structure as a starting point for future projects. The idea is that I'd copy the contents of the repo when starting a project and build from there.
So I've copied the entire amplify directory over to a new location, removing everything that is in the amplify section of the .gitignore file, and I have a folder structure like this:
My steps to start a new project are:
Create a new directory
Copy the template repo files into this directory
run amplify init
Once I do that, it creates two additional files:
amplify/.config/project-config.json
amplify/team-provider-info.json
Then I try to do amplify push, but I get "No changes detected."
I'm not even sure if what I'm doing is possible -- I was wondering if anyone else has tried this?
I decided to migrate a published Deno module to a new GitHub repo (not just change the repo name), but I cannot find a way to do it.
The manual says:
Module versions are persistent and immutable. It is thus not possible
to edit or delete a module (or version), to prevent breaking programs
that rely on this module. Modules may be removed if there is a legal
reason to do so (for example copyright infringement).
Does that mean the repository info is permanent and immutable too? I tried to use the same webhook link in my new repository, but when I published a new version, I noticed it didn't trigger the update on deno.land/x/. The webhook response is:
{"success":false,"error":"module name is registered to a different repository"}
Is it possible to change the associated GitHub repository link for a published deno module? And if so, how to?
To answer your question in case anyone is curious in the future: Luca Casonato (a Deno core team member) clarified that you just need to send an email:
Yes, send an email to modules@deno.com
Hope that helps!
I have a VM that was not deployed through ARM, and I would like to add some additional information to it (a tag, for example). I cannot find any examples of how to approach this situation. Any links, or an explanation in a nutshell of how this should be accomplished, would be appreciated.
You would want to generate/export the existing ARM template; you can do this using the following guide: https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-export-template
Go to the Resource Group
Click Automation Script
Alternatively, you can go to https://resources.azure.com to extract RM templates for existing resources.
Then, the next step is to modify it based on your needs, e.g. to add the tags.
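As an illustration, here is a small Python sketch that adds a tag to the exported template file; the file name, resource type filter, and tag value are just placeholders:
import json

# Load the exported ARM template (placeholder file name).
with open("template.json") as f:
    template = json.load(f)

# Add a tag to every virtual machine resource in the template.
for resource in template.get("resources", []):
    if resource.get("type") == "Microsoft.Compute/virtualMachines":
        resource.setdefault("tags", {})["environment"] = "dev"

with open("template.json", "w") as f:
    json.dump(template, f, indent=2)

Redeploying the modified template (incremental mode is the default) should then apply the tag to the existing VM.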
I'm a bit of a newbie, but already running apps with Meteor.js. Since I'm now working with API keys, I'm finally realizing that security is a thing, so I placed my keys in a settings.json file and am instructed not to commit it, i.e. to .gitignore the file. But despite reading the documentation, this all seems very counter-intuitive. If I need the variables to make my HTTP requests, then how can my app possibly function without adding my keys, in some form, to the repo? I know the answer is "it can," but how? (In general terms; I don't need a Meteor specialist yet.)
Typing this question out makes me feel pretty ignorant for the stage I'm at, but the docs out there for some reason are not clarifying this for me.
You can generate the file with sensitive information on git checkout.
That is called a smudge script, part of a content filter driver, using a .gitattributes declaration.
(image from "Customizing Git - Git Attributes", from "Pro Git book")
That 'smudge' script (which you have to write) would need to:
fetch the right key (from a source outside the repo, so there is no risk of adding and pushing it by mistake)
generate settings.json, using a tracked template settings.json.tpl with placeholder values in it to replace.
That means:
the template settings.json.tpl is added to the git repo
the generated file settings.json is declared in the .gitignore file and never versioned.
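For illustration, a minimal smudge script in Python could look like this, assuming the key comes from an environment variable and that settings.json.tpl uses an @API_KEY@ placeholder (both names are just examples):
#!/usr/bin/env python3
# Minimal smudge filter sketch: git pipes the checked-out content of
# settings.json.tpl to stdin; we generate settings.json from it (that file
# stays gitignored) and echo the template content back to stdout unchanged.
import os
import sys

template = sys.stdin.read()
api_key = os.environ.get("API_KEY", "")   # the key lives outside the repo

with open("settings.json", "w") as f:
    f.write(template.replace("@API_KEY@", api_key))

sys.stdout.write(template)

You would then declare it with something like settings.json.tpl filter=inject-keys in .gitattributes and git config filter.inject-keys.smudge "python3 smudge.py" (the filter and script names are assumptions here).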
I think this might be an IIS7 permissions thing, but I'm tagging it with OpenWrap because I might be wrong. When I try to publish a wrap to an HTTP repository I get the following error:
PS C:\OpenWrapExamples\Ninject> o publish-wrap -Name Ninject -remote MyHttpRepo
# OpenWrap Shell 2.0.0.10
# Copyright © naughtyProd Limited 2009-2011
# Using C:\OpenWrapExamples\Ninject\wraps\_cache\openwrap-1.0.1.81349963\bin-net35\OpenWrap.dll (1.0.0.0)
Publishing package 'Ninject-2.2.0.85378492.wrap' to 'MyHttpRepo'
The repository OpenWrap.Repositories.Http.HttpRepositoryNavigator is read-only.
I've tried setting the permissions on the folder, but that doesn't work either.
If you just exposed an indexed-folder (one you added with file:///path/) as an IIS site, it will be read-only (as there's little we can do with that).
If you use OpenWrap 1.0, you can simply add two remotes: one for the UNC path (so you can publish) and one for the HTTP one (so you can read the content back).
If you use the upcoming OpenWrap 2.0.1, you can simply add both in one go:
o add-remote http://server/ -publish file://server/path/to/share
If you want a repository writeable over HTTP, you can implement that feature yourself rather easily: have your index file at /index.wraplist, add an endpoint that supports a POST with some content (that's the package), say at /upload, and add the following to your index.wraplist:
OpenWrap will then happily upload to an http endpoint.
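As a rough illustration (not OpenWrap's own code), a minimal /upload endpoint that just stores the POSTed package could look like this in Python with Flask; the target directory and the name query parameter are assumptions:
from flask import Flask, request

app = Flask(__name__)

@app.route("/upload", methods=["POST"])
def upload():
    # The request body is the package; save it next to the repository index.
    filename = request.args.get("name", "package.wrap")   # hypothetical name parameter
    with open(f"packages/{filename}", "wb") as f:
        f.write(request.get_data())
    return "", 201

if __name__ == "__main__":
    app.run()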