With the winget command, how do you list only upgradeable apps?

There doesn't seem to be a built-in option to filter by available upgrade or source.
I have tried this but it still lists everything:
winget list --source winget

I found a solution that works. You use the upgrade command with no arguments instead of the list command. This wasn't clear to me in the documentation but works perfectly.
winget upgrade

Try running the command winget upgrade to get the list of available upgrades.
From the response, if you want to upgrade any of them, type winget upgrade <id name>
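For example (a minimal sketch; Microsoft.PowerToys is only an illustrative package Id, use one taken from your own winget upgrade output):
winget upgrade
winget upgrade --id Microsoft.PowerToys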

Related

How to upgrade airflow?

There seems to be no proper documentation about upgrading airflow. The Upgrading Airflow to a newer version page only talks about upgrading the database. So what is the proper way of upgrading airflow?
Is it just upgrading the python packages to the newest versions? Or should I use the same venv and install the newer airflow version completely from scratch? Or is it something else altogether?
I'm guessing doing the database upgrade would be the final step followed by one of these steps.
I was also struggling with upgrading airflow for minor versions and didn't feel like I found a good answer in the docs. I think I have the right approach after looking back at how I installed airflow in the first place.
If you followed the guide to run Airflow locally, you'll want to change the value of AIRFLOW_VERSION in the install commands to your desired version.
If you followed the guide to run airflow on docker, then you'll want to fetch the latest docker-compose.yaml. The command on the site always has the latest version. Then re-run docker compose up.
You can confirm you have the right version by running airflow version. I run airflow via docker so the docker steps work for me, I imagine the local steps should be about the same.
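For the local route, a minimal sketch of that install with the target version pinned (2.3.0 is just an example; the constraints-file pattern is the one the Airflow install docs use):
AIRFLOW_VERSION=2.3.0
PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
For the Docker route, fetching the new docker-compose.yaml and re-running docker compose up picks up the new image.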
Adding to Vivian's answer -
I had installed airflow from PyPi and was upgrading from 2.2.4 to 2.3.0.
To upgrade airflow,
I installed the new version of airflow in the same virtual environment as 2.2.4 (using this).
Upgraded the database using airflow db upgrade. More details here.
You might have to manually upgrade providers using pip install packagename -U
After this, when I started Airflow, I got an error related to some missing conf. Airflow wanted the newest version of airflow.cfg, but I had the older version. To fix this,
Renamed my airflow.cfg to airflowbackup.cfg. This is done so that Airflow will create a new airflow.cfg on startup when it sees that there is no config file.
Compared airflowbackup.cfg with the 2.2.4 config to find out all the fields I had changed.
Manually made those same changes in the newly made airflow.cfg
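Put together, the sequence I ran looks roughly like this (versions and the provider package are just my example; swap in your own, and use a constraints file as in the install sketch above):
pip install "apache-airflow==2.3.0"                 # in the same venv as 2.2.4
airflow db upgrade
pip install apache-airflow-providers-snowflake -U   # repeat for each provider you use
mv airflow.cfg airflowbackup.cfg                    # so Airflow regenerates a fresh airflow.cfg
airflow version                                     # any airflow command recreates the missing config file
diff airflowbackup.cfg airflow.cfg                  # then re-apply your old changes to the new file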

Unable to see snowflake conn_type in Airflow

I can see the following in my pip list but when I try to add a Snowflake connection via the GUI, Snowflake is not an option from the dropdown.
apache-airflow-providers-snowflake 2.1.0
snowflake-connector-python 2.5.1
snowflake-sqlalchemy 1.2.3
Am I missing something?
I have had this issue with MWAA recently.
I found that if I select AWS in the dropdown and provide the correct Snowflake host name etc., it works though.
I ran into the same issue using the official Helm chart 1.3.0.
But finally I was able to make the Snowflake connection visible by doing the following steps:
I uninstalled apache-airflow-providers-google. I'm not sure whether this is important, but I mention it here because I got some warnings from it.
Because SQLAlchemy 1.4 introduced some breaking changes, I made sure that version 1.3.24 gets installed, and based on that I chose the fitting versions for the Snowflake packages.
So this is my requirements.txt for my custom Airflow container:
apache-airflow-providers-snowflake==2.3.0
pyarrow==5.0.0
snowflake-connector-python==2.5.1
snowflake-sqlalchemy==1.2.5
SQLAlchemy==1.3.24
This is my Dockerfile:
FROM apache/airflow:2.2.1-python3.8
## adding missing python packages
USER airflow
COPY requirements.txt .
RUN pip uninstall apache-airflow-providers-google -y \
&& pip install -r requirements.txt
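Building and pushing the custom image could then look roughly like this (registry and tag are placeholders; point the Helm chart's image values at whatever you push):
## example registry/tag only; use your own
docker build -t my-registry/airflow-custom:2.2.1 .
docker push my-registry/airflow-custom:2.2.1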
I had the same issue where my pip freeze showed the apache-airflow-providers-snowflake yet I did not have the provider in the UI. I had to add the line apache-airflow-providers-snowflake to my requirements.txt file and then restart. Then I was able to see the Snowflake provider and connector in the UI.

Airflow upgrade to 1.10.14 failed and cannot roll back to previous release which has 1.10.10

I upgraded the Docker image to use Airflow 1.10.14. Airflow is deployed with Helm and I have an init-container which executes a script to initialize Airflow. The init script contains these commands:
...
airflow upgradedb
alembic upgrade heads
...
The upgrade failed, so I need to roll back to the previously deployed release, which contains Airflow 1.10.10, but it is now getting an Alembic error. Based on my searching, I tried deleting the row/record in the alembic_version table.
The error in scheduler container is this:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.DuplicateColumn) column "operator" of relation "task_instance" already exists
All the other pods are running fine (webserver and workers).
Any resolution/workaround to this issue?
Unless you are OK with scrapping your entire metadata DB (connections, variables, task runs, etc.), I would opt to just push on to 1.10.15 and see if the bug you encountered is resolved there. To the best of my understanding, it is not possible to downgrade the DB after the upgrade has been done.
I suggest upgrading to 1.10.15 in case the issue you hit is similar to the one this user encountered here; the CLI fix can be found here. If you ran into a different issue with your 1.10.14 upgrade besides the CLI one I noted, it might be worth investigating a path to resolving that instead.
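A rough sketch of that forward path, reusing the init-container commands from the question (the base image tag is an assumption; adjust to however you build your image):
## rebuild your image on top of apache/airflow:1.10.15, redeploy with helm,
## and let the init-container re-run:
airflow upgradedb
alembic upgrade heads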

DoctrineMigrationsBundle 3.0.1: metadata storage is not up to date

I recently got this error when running bin/console doctrine:migrations:migrate:
The metadata storage is not up to date, please run the sync-metadata-storage command to fix this issue.
However, running the sync-metadata-storage command yields the same error.
What can I do?
As mentioned in this GitHub issue, one possible fix is to specify the database server version in the connection URL:
DATABASE_URL=mysql://root:@127.0.0.1:3306/test?serverVersion=mariadb-10.4.11
Then, you should be able to run the bin/console doctrine:migrations:sync-metadata-storage command.
Read more about this configuration option in the doctrine documentation:
[…] you can pass the serverVersion option with a vendor specific version string that matches the database server version you are using […]
If you are running a MariaDB database, you should prefix the serverVersion with mariadb- (ex: mariadb-10.2.12).
I had to downgrade the doctrine/doctrine-migrations-bundle to version "^2.1"
Not sure if that applies here, but I had issues with Doctrine lately as well.
I did a composer update and ever since then, my project wouldn't run anymore. My issue was based on a new version of the following bundle:
https://symfony.com/doc/master/bundles/DoctrineMigrationsBundle/index.html
They restructured the doctrine_migrations.yaml file and I still had the old one. I tried to change the contents to the new 3.0 version, but that led exactly to your error.
Since the bundle comes with the symfony/orm-pack package, you first have to unpack it to be able to manually change the version inside your composer.json: composer unpack symfony/orm-pack
After unpacking you will see the following line inside your composer.json: "doctrine/doctrine-migrations-bundle": "*", which I changed to "doctrine/doctrine-migrations-bundle": "^2.1". Then I ran composer update again. You may specify only the migrations bundle if that's all you want to update.
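In short, the steps look something like this (a sketch of what I did; the composer.json edit happens by hand between the two commands):
composer unpack symfony/orm-pack
## edit composer.json: "doctrine/doctrine-migrations-bundle": "^2.1"
composer update doctrine/doctrine-migrations-bundle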
I could fix my issue just by removing ?serverVersion=5.7 from the DATABASE_URL, and it worked just fine.

Arrow keys do not work with firebase init command

$ firebase init
! Caution! Initializing outside your home directory
? What Firebase do you want to use? (Use arrow keys)
I tried to press arrow keys but nothing is happening. How can I select an existing firebase app while doing firebase init?
When Git Bash let me down, running the same command in CMD did the job.
I encountered the same issue. Downgrading firebase-tools version to 6.0.0 worked for me. Currently, the latest version is 7.0.0.
Edit:
The Firebase CLI interface has changed, so my original response below no longer works.
The good news is that this now seems to work normally in the Windows console. Tested with Windows 10, firebase-tools 3.0.8 and Node 6.10.0.
You can skip the initial radio box feature selection by specifying one or both of database or hosting after init. E.g.:
> firebase init database
Using -P <project-name> to specify the project name doesn't seem to work for init though and it still prompts...
Original answer:
I just ran into this. Use:
> firebase init -f <name-of-firebase>
To get a list of all command line options:
> firebase init -h
I was also stuck with this issue yesterday. Today this has been fixed in firebase-tools version 7.0.1.
Reinstall or update firebase-tools and hopefully the problem vanishes for you as well.
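A minimal sketch of that update, assuming firebase-tools was installed globally through npm:
npm install -g firebase-tools@latest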
Update your NodeJs version to use arrow keys on Windows
It wasn't working when using the Bash and CMD terminals; it only worked with the PowerShell terminal.
It should work if you use a newer version of NodeJS.
I use version 6.10.2 running in 'Bash on Windows'.
AND! I use the 'Windows Creators Update' to get the latest version of Windows 10.
