How to make HTML changes or customize module like poll in BigBlueButton html5-client? - meteor

I'm using BigBlueButton (2.3-dev) on an Ubuntu 18.04 server. I installed it using bbb-install (# wget -qO- https://ubuntu.bigbluebutton.org/bbb-install.sh | bash -s -- -v bionic-230-dev -s bbb.example.com -e info#example.com -a -w) and it works perfectly.
Now I want to make some changes in the html5-client (https://domain/html5client/join?sessionToken=e).
I found the file path /usr/share/meteor/bundle, and the client is served from /usr/share/meteor/bundle/programs/web.browser. The problem is that these are build files, so I can't make changes there: they are regenerated every time the service is stopped and started, or restarted.
I want to add one link in the left side menu (http://prntscr.com/umy63l). How can I do this, and where?
Thanks in advance!

Did you install a dev environment for bbb-html5? You can find the documentation about it here:
https://docs.bigbluebutton.org/2.2/dev.html#developing-the-html5-client
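For reference, a rough sketch of that dev setup based on the linked docs. The repository URL is real, but the branch name and exact commands may differ for a 2.3-dev build, so treat this as an outline rather than a recipe:

```shell
# Outline only; check the linked docs for the steps matching your version.
git clone https://github.com/bigbluebutton/bigbluebutton.git
cd bigbluebutton/bigbluebutton-html5
git checkout v2.2.x-release   # example branch; pick the one matching your server
meteor npm install            # install the client's dependencies
npm start                     # serve the client in development mode
```

Once the dev client runs, UI code (including the sidebar) lives under imports/ui/components (the exact layout varies by version), and edits show up on reload instead of being lost to a rebuilt bundle under /usr/share/meteor/bundle.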

Related

Mount local volume accessible to R/RStudio in docker (tidyverse)

There are a ton of little-upvoted questions about how to address local folders from inside a docker container, but I can't find one that quite matches mine, so here goes another one:
How can I run a docker container, and mount a local folder so that it's accessible by R/RStudio, inside the container?
That sounds kind of like "Mounting local home directory in RStudio docker?", and using a similar approach I can start a container and mount a volume:
docker run -d -p 8787:8787 -v $HOME/my_folder:/LOOKATMEEE -e ROOT=TRUE rocker/tidyverse:3.4
and if I run a bash shell in the container, I can see the folder:
docker exec -it 38b2d6ca427f bash
> ls
bin  dev  home  lib  LOOKATMEEE  mnt  proc  run  srv  tmp  var  boot  etc  init  lib64  media  opt  root  sbin  sys  usr
# ^ there it is!
But if I connect to RStudio Server at localhost:8787, I don't see it in the Files pane, nor does it show up when I run list.files() in the R console:
I'm sure I'm missing something basic, but if someone can tell me what that is... thank you!
In this circumstance, R and RStudio have a default working directory of /home/rstudio, two levels down from /, which is where I had told Docker to mount the folder.
After the docker run command in the question, you can run list.files('/') to see the folder.
If you want your folder to show up in the default working directory for R, as I do, then modify docker run like this:
docker run -d -p 8787:8787 -v $HOME/my_folder:/home/rstudio/LOOKATMEEE -e ROOT=TRUE rocker/tidyverse:3.4
and there it shall be:
Thank you to user alistaire.
This answer is for future generations :)
The concept is a "match" (a bind mount) of a resource on the host to a path inside the container.
The command structure should be like this:
docker run -d -e PASSWORD=<your_password> -p 8787:8787 -v <host_dir>:/home/rstudio/<container_dir> rocker/rstudio
Check the explanation here
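Filling in concrete values, a run might look like this (the password and folder names are just examples):

```shell
# Example only: mount $HOME/projects so it appears as ~/projects inside RStudio
docker run -d \
  -e PASSWORD=mysecret \
  -p 8787:8787 \
  -v "$HOME/projects":/home/rstudio/projects \
  rocker/rstudio
```

Note that recent rocker/rstudio images refuse to start without a PASSWORD (or DISABLE_AUTH=true), which is why the -e flag matters here.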

Kaa docker deployment

I'm trying to install Kaa using Docker. I follow the installation guide (https://kaaproject.github.io/kaa/docs/v0.10.0/Administration-guide/System-installation/Docker-deployment/#deployment-process) step by step:
1- Install Docker
2- Install Docker Compose
3- Build the Kaa project locally
4- Run the following command from the server/containers/docker directory.
docker build --build-arg setupfile=kaa-node.deb -t kaa-node:0.10.1 .
Then, the single-node installation:
1- Specify the TRANSPORT_PUBLIC_INTERFACE parameter in the server/containers/docker/docker-compose-1-node/kaa-example.env file; the host's IP can be found with:
ip route get 8.8.8.8 | awk '{print $NF; exit}'
2- cd docker-compose-1-node/mariadb-mongodb/
At this point everything was going OK, but when I ran the next command (3-) I got an error message at the end:
3- docker-compose up
Any help would be appreciated; I need to solve this problem!
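Step 1 can also be scripted. The following is a sketch that assumes TRANSPORT_PUBLIC_INTERFACE already appears as a key in kaa-example.env (back the file up first):

```shell
# Detect the host's outbound IP and write it into the env file (sketch)
HOST_IP=$(ip route get 8.8.8.8 | awk '{print $NF; exit}')
sed -i "s/^TRANSPORT_PUBLIC_INTERFACE=.*/TRANSPORT_PUBLIC_INTERFACE=${HOST_IP}/" kaa-example.env
```

On some iproute2 versions the last field of `ip route get` is not the source address (it can be a uid), so check the command's raw output and adjust the awk field if needed.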
It looks like you are referring to image tag 0.10.1 but actually have/need 0.10.0.
If you haven't solved this by now, it is possible that the documentation is not 100% correct.
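If the mismatch really is just the tag, one way out is to rebuild or retag so that the name docker-compose looks up actually exists; the version strings below are taken from this thread as examples:

```shell
# Rebuild with the tag the compose files expect:
docker build --build-arg setupfile=kaa-node.deb -t kaa-node:0.10.0 .

# Or alias the already-built image under the expected tag instead of rebuilding:
docker tag kaa-node:0.10.1 kaa-node:0.10.0
```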

How to update chroot with keyboard target after moving to USB drive

I have created a chroot using
sudo sh ~/Downloads/crouton -r precise -t unity
I did some config in the chroot and ran a -u update.
Then I moved it to a flash drive with
sudo edit-chroot -m ~/media/removable/MYFLASHDRIVE precise
where I can run it with -c /media/removable/MYFLASHDRIVE as per this issue
I now wish to add the keyboard target with
sudo sh -e ~/Downloads/crouton -n raring -t keyboard -u
but there is no option to modify the path (like -c for edit-chroot), and the issue above indicated there is no way to modify crouton's default chroot directory.
How can further targets be added to the chroot without moving it back off the usb drive?
I was able to make it work by symlinking to a directory on my external drive and then running the commands as normal.
Back up your chroots first: this worked for me, but I can't guarantee it will work for you, or that it won't somehow delete your stuff.
1. Label your drive "external". Using a separate Ubuntu box is the easiest way: install GParted through apt-get or the software store, run it, make sure your external drive is selected in the top-right drop-down, right-click your drive's partition and select "Label". Type "external", click OK, click Apply.
2. Create a folder on your drive called "chroots". Move your chroot's folder into it.
3. Set up the symlink on your Chromebook. Open a new chronos shell and run these commands:
cd /mnt/stateful_partition/crouton
sudo mv chroots chroots.old
sudo ln -s /media/removable/external/chroots ./chroots
4. Run crouton commands as normal. You shouldn't need to specify -c on any of your crouton commands, you can just run them as if the chroot was installed locally.
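The mechanism in step 3 can be sanity-checked with a throwaway simulation before touching the real paths; everything below uses temporary directories only, standing in for /media/removable/external/chroots and /mnt/stateful_partition/crouton:

```shell
# Simulate the layout: an "external drive" and a "crouton" dir under a temp root
demo=$(mktemp -d)
mkdir -p "$demo/external/chroots" "$demo/crouton"
# The symlink stands in for /mnt/stateful_partition/crouton/chroots
ln -s "$demo/external/chroots" "$demo/crouton/chroots"
readlink "$demo/crouton/chroots"   # shows the "external" chroots path
```

Anything written through the symlink lands on the "external" side, which is exactly why crouton commands then work without -c.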

Get list of files via http server using cli (zsh/bash)

Greetings to everyone,
I'm on OS X. I use the terminal a lot, a habit from my old Linux days that I never got past. I wanted to download the files listed on this HTTP server: http://files.ubuntu-gr.org/ubuntistas/pdfs/
I selected them all with the mouse, put them in a text file, and then gave the following command in the terminal:
for i in `cat ../newfile`; do wget http://files.ubuntu-gr.org/ubuntistas/pdfs/$i;done
I guess it's pretty self explanatory.
I was wondering if there's an easier, better, cooler way to download these linked PDF files using wget or curl.
Regards
You can do this with one line of wget as follows:
wget -r -nd -A pdf -I /ubuntistas/pdfs/ http://files.ubuntu-gr.org/ubuntistas/pdfs/
Here's what each parameter means:
-r makes wget recursively follow links
-nd avoids creating directories so all files are stored in the current directory
-A restricts the files saved by type
-I restricts by directory (this one is important if you don't want to download the whole internet ;)
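If you'd rather stay closer to the original loop, the index page's links can also be extracted with grep/sed and turned into full URLs. A sketch, with the index HTML inlined for illustration (in practice you'd substitute html=$(curl -s http://files.ubuntu-gr.org/ubuntistas/pdfs/), and the filenames here are made up):

```shell
base="http://files.ubuntu-gr.org/ubuntistas/pdfs/"
# Inlined stand-in for the server's directory listing (illustrative filenames)
html='<html><body><a href="issue1.pdf">issue1.pdf</a> <a href="issue2.pdf">issue2.pdf</a></body></html>'
# Pull out the .pdf hrefs, strip the href="..." wrapper, prepend the base URL
urls=$(printf '%s\n' "$html" \
  | grep -oE 'href="[^"]+\.pdf"' \
  | sed -E 's/href="([^"]+)"/\1/' \
  | sed "s|^|$base|")
printf '%s\n' "$urls"
# Each resulting line can then be fetched with wget "$url" or curl -O "$url"
```

This avoids the intermediate text file entirely, though wget -r -A pdf remains the shorter answer.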

How to change relative URL to absolute URL in wget

I am writing a shell script to download and display the content from a site and I am saving this content to my local file system.
I have used the following command in the script to get the content:
/usr/sfw/bin/wget -q -p -nH -np --referer=$INFO_REF --timeout=300 -P $TMPDIR $INFO_URL
where INFO_REF is the page where I need to display the content from INFO_URL.
The problem is that I can get the content (images/CSS) as an HTML page, but the links on the images and headlines that point to a different site don't work, because those URLs (image links) are rewritten to my local file-system path.
I tried adding the -k option to wget; with it, those URLs point to the correct location, but then the images stop loading because their paths change from relative to absolute. Without -k the images come through properly.
What option can I use so that both the images and the links in the page come through properly? Do I need to use two separate wget commands, one for the images and another for the links in the page?
As per the wget manual:
Actually, to download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to -p:
wget -E -H -k -K -p http://site/document
In order to adjust it to your needs:
/usr/sfw/bin/wget -q -E -H -k -K -p -nH --referer=$INFO_REF --timeout=300 -P $TMPDIR $INFO_URL
I removed the -np because I think it's wrong (maybe a page dependency is in the parent directory).
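For reference, the added flags break down as follows (meanings taken from the wget manual; http://site/document is the manual's placeholder URL):

```shell
# -E (--adjust-extension): save HTML with a .html extension when needed
# -H (--span-hosts): allow page requisites to be fetched from other hosts
# -k (--convert-links): rewrite links in the saved page for local viewing
# -K (--backup-converted): keep each original file with a .orig suffix
# -p (--page-requisites): fetch the images/CSS needed to display the page
wget -E -H -k -K -p http://site/document
```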
