SDDM Fails to Show, Requires Command Line Login First

On boot the computer was showing a command line login; the SDDM login screen was failing to appear.

Once logged in, if I waited a while and then pressed any key the SDDM login screen would appear, and the username and password could be entered as usual.

Interestingly, in the top bar the US keyboard layout was selected (no others were available).

I also found that after entering the wrong password I was returned to the login screen once more, where the GB layout was shown.

Through searches on the Internet I found this reference:

https://forum.manjaro.org/t/sddm-very-slow-to-show-up/46361/5

Whilst not itself the answer, the reference it included to this article

https://forum.manjaro.org/t/sddm-not-loading-properly-on-boot/34484

provided the solution.

The recommendation was to add the haveged package.

First install the package: apt-get install haveged

The recommendation was then to enable and start haveged:

systemctl enable haveged
systemctl start haveged
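
On systemd-based distributions the two commands can usually be combined, and the available entropy can be checked to confirm haveged is making a difference. This is a quick sketch assuming a reasonably recent systemd and a kernel that exposes the entropy counter:

systemctl enable --now haveged
cat /proc/sys/kernel/random/entropy_avail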

Rather than start the service manually, I chose to restart the computer to see whether the problem persisted.

Problem solved! Albeit still with the US layout.

Create Website Thumbnails from the Command Line

While creating a new portfolio page for a website I was looking for a simple-to-use thumbnail creator.

The thumbnail creator should have the following criteria:

  • minimum of steps, one would be ideal
  • easily repeatable
  • consistent output sizing

I wanted to avoid:

  • resizing the browser to specific dimensions for each batch – once the browser is closed, reproducing the exact sizing is problematic
  • relying on an external service, such as the old thumbnail service from Alexa or ShrinkTheWeb

I knew that I could take a screenshot of the browser, either with one of the browser tools or with a screen capture program. This could then be transferred to GIMP for image editing and saving. Or maybe save the images and do a batch conversion with a tool such as XnConvert.

I wanted to keep the thumbnail creation both simple and easily reproducible.

Looking through my previous articles I found one about taking screenshots from the command line using wkhtmltoimage, part of the wkhtmltopdf package.

neil@local:~$ wkhtmltoimage --width 1200 --height 1000 http://www.traditionalrestoration.co.uk traditionalrestoration.co.uk.jpg

It was easy to put in the website address and an appropriate file name and press return. Within a few seconds a thumbnail of the website had been captured and saved. I then replaced the website URL to work my way through the list.
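
With the list of sites in a plain text file the captures can be scripted. This is a rough sketch rather than the exact approach from the article; sites.txt is a hypothetical file containing one domain per line:

while read -r site; do
    wkhtmltoimage --width 1200 --height 1000 "http://${site}" "${site}.jpg"
done < sites.txt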

Create websites from command line: traditionalrestoration.co.uk

I did find that pages employing Flash had empty spaces. For example this one from Lily Oakes’ website, which incorporates content from YouTube:

Create websites from command line: lilyoakes.co.uk

Also one or two pages with sliders, which took longer to render, had missing content. These would later be revisited using the slower, more fiddly method of browser, screenshot and GIMP.


How to Redirect HTTP to HTTPS Using .htaccess File

With the emphasis on sites being served over https rather than http, there is also the matter of redirecting the old http website references to the new encrypted https versions.

The .htaccess file in the root of the web site supports the use of redirect rules.

Using the .htaccess file’s RewriteCond and RewriteRule pairing we’ll redirect the http website pages to their corresponding https encrypted versions.

Let’s start with the simple redirection.

We’ll begin by redirecting the non-encrypted page to the encrypted website. For now we’ll ignore the finer point of serving up the same page.

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST} [R,L]

In the above example I’ve assumed that no rewrite rules already exist, so I’ve added the first line to turn the rewrite engine on.

This is followed by the condition which we are testing for and the rule to be applied if the match is successful.

In the above we are testing to see if the page is encrypted using https. If not then replace the current URL with the server host name.

The rule, as shown, has 3 parameters: the string to match; the replacement string and some flags for the web server.

In this example

  • (.*) – take all of the characters in the URL
  • %{HTTP_HOST} – the website’s host (server) name
  • [R,L] – the R tells the web server (Apache) that this is a temporary redirect, whilst the L tells it to stop any further processing if the rule matches

Now to take this a step further.

Clearly, entering a page which you have visited before, only to find that you are viewing the root of the website, will not be conducive to a happy website audience. Your visitor won’t care so much that it’s encrypted. It will also affect your website rankings and listings.

Let’s add the extra part to maintain our viewed page.

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R,L]

Now when the old page is visited it redirects to the encrypted one.

The final improvement is to tell the browser and search engine robots that this change is permanent by setting the value of R to 301.

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

The last part of the line details the type of redirect:

  • 301 is a permanent redirect – change the simple R flag to R=301.
  • 302 is temporary.

It can be very difficult to force a browser to revert after setting a permanent redirect. For this reason it’s best to use a temporary one whilst testing.
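
For example, whilst testing, the rule above can carry an explicit temporary redirect flag, to be swapped for R=301 once everything works. This variant is a suggestion rather than a rule taken from the article:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=302]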

Do not duplicate RewriteEngine On.

I’ve seen .htaccess files that also contain a RewriteEngine Off directive. Make sure your redirect rule isn’t sitting outside of the active section between these two directives.

If, when testing, your redirect is found not to be working, check that it’s within an active RewriteEngine section.

If you have more than one redirect rule, where in the sequence should this https redirect rule be placed?

Put your https redirect early in your list of redirects.

It’s efficient to handle the specific rules early on. Every rule that is tested means additional server processing, giving rise to a slower server and a slower site response. Clear and precise rules should be listed first.
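
As a rough sketch of that ordering (the second rule here is purely a hypothetical example of a later, more specific rewrite, not one from the article):

RewriteEngine On

# Force https first
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Other, site-specific rewrites follow (hypothetical example)
RewriteRule ^old-page$ /new-page [L,R=301]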

Website Error 406 Not Acceptable

I like to run WordPress websites with friendly URLs. The permalink setting of Post name (http://www.example.com/sample-post/) is selected for this.

However, on a new WordPress website, after clicking on Save Changes I was getting a screen showing a 406 Not Acceptable error.

406 not acceptable

I investigated making this setting manually, editing the file .htaccess.

The section relevant to the permalinks was empty:

# BEGIN WordPress

# END WordPress

I was expecting something more akin to this:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>

# END WordPress

As a workaround I added in the rewrite code and uploaded the file to the website. Refreshing the permalink page made no difference, and reviewing the .htaccess file once more, the content between the WordPress lines was again empty.

Other content within the file was retained and actioned.

Searching the Internet for more details I found reference to ModSecurity.

And here it is on the control panel:

406 not acceptable :control panel modsecurity

What’s the default setting for ModSecurity? Is it active, and can it be turned off?

To begin I clicked on the ModSecurity icon.

406 not acceptable :control panel configure modsecurity

In the top row click on the Disable button.

406 not acceptable :control panel disable modsecurity

Confirm Disable all.

There you have it: correcting the 406 Not Acceptable error by disabling ModSecurity.

How to Disable KWallet Popups when Starting Programs

Opening some programs in KDE also causes the KDE wallet popup window to appear.

Indeed, when I close the Vivaldi browser the KDE wallet popup opens once more.

I have found that the programs which open the KDE wallet dialogue are: Vivaldi browser; Google Chrome browser; and Choqok micro blogger.

I wanted to find a way to prevent the KDE wallet popup dialogue window from opening. I began by searching on the Internet for ideas; however, I finally found the answer in the system settings.

Disable KDE Wallet Popups

Having clicked to open the programs the KDE wallet dialogue window opens.

The kwalletrc file

I found reference to the kwalletrc file.

The dialogue is controlled by settings in the file

.kde/share/config/kwalletrc

Edits may be done either from the command line or by using an editor such as Kate.

To find the file with the Dolphin file browser, note that it is in a hidden directory. To show hidden files hold down the <Alt> key and press the full stop. You can then navigate to the directory .kde/share/config.

Once there open the file kwalletrc for editing.

I chose to edit the file from the command line.
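
For example, with a terminal editor (nano here is just an illustration; any editor will do):

nano ~/.kde/share/config/kwalletrc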

Sample contents of the file:

[$Version]
update_info=kwallet-4.13.upd:kde4.13

[Wallet]
Close When Idle=false
Close on Screensaver=false
Default Wallet=kdewallet
Enabled=false
First Use=false
Idle Timeout=10
Launch Manager=false
Leave Manager Open=false
Leave Open=false
Prompt on Open=false
Use One Wallet=true

To the above, add an [Auto Deny] section listing the programs for which you wish to disable the use of KDE Wallet. For example:

[Auto Deny]
kdewallet=Vivaldi
kdewallet=Choqok

The service will need restarting for the configuration file to be re-read. It’s easier to do this from the command line.

To find the KDE wallet process(es):

ps -A | grep wall
3683 ?        00:00:00 kwalletd5
10228 ?        00:00:01 kwalletd5

Therefore to restart the service use:
killall -9 kwalletd5

Alternatively logging out and back in again will ensure that the service is restarted.

But, I found that this hadn’t corrected my issue.

I decided to look to see if there was another configuration file. I found

.config/kwalletrc

which had the following content:

[Migration]
alreadyMigrated=true

[Wallet]
First Use=false

I added the lines previously added, giving:

[Migration]
alreadyMigrated=true

[Wallet]
First Use=false

[Auto Deny]
kdewallet=Chromium
kdewallet=Vivaldi
kdewallet=Choqok

I restarted the kwallet service as before.

I had still failed to stop the popups.

System Settings

In the end it was really simple: I found a reference to KDE Wallet in the system settings.

Open the system settings, go to the account settings and then disable the KDE wallet subsystem.

Disable KDE wallet popups KDE wallet configuration

In the image above, see the checkbox Enable the KDE wallet subsystem.

I unticked the checkbox and this popup was shown:

Disable KDE wallet popups allow saving wallet configuration settings

I entered my user password, NOT root.

I tried opening the programs again (Vivaldi and Chrome), with the same kwallet pop-up on opening and closing.

I left this for further consideration, thinking that perhaps all would be resolved following either a restart or a logout/login.

When I next started the computer and tried opening these browsers there was no KDE wallet popup.

The opening of the KDE wallet dialogue box when starting some programs had been resolved.


Wi-Fi Authenticated and then Disconnected with DEAUTH_LEAVING

To connect a laptop to a Wi-Fi network I was using one of my USB Wi-Fi adapters.

I found that the network could be configured but the connection wasn’t being made.

To understand why the computer was failing to connect to the Wi-Fi I looked at the syslog using

tail -f /var/log/messages

I found that it was authenticating and then aborting the authentication with the reason DEAUTH_LEAVING.

aborting authentication with 22:11:bb:ee:77:99 by local choice (Reason: 3=DEAUTH_LEAVING)

Here’s the messages log:

Jan 17 07:38:10 comp56 NetworkManager[447]: <info>  [1484638690.0736] Config: added 'ssid' value 'mywifi'
Jan 17 07:38:10 comp56 NetworkManager[447]: <info>  [1484638690.0736] Config: added 'scan_ssid' value '1'
Jan 17 07:38:10 comp56 NetworkManager[447]: <info>  [1484638690.0737] Config: added 'key_mgmt' value 'WPA-PSK'
Jan 17 07:38:10 comp56 NetworkManager[447]: <info>  [1484638690.0737] Config: added 'psk' value '<omitted>'
Jan 17 07:38:10 comp56 NetworkManager[447]: <info>  [1484638690.0769] sup-iface[0x80ccb680,aax000f79546295]: config: set interface ap_scan to 1
Jan 17 07:38:10 comp56 NetworkManager[447]: <info>  [1484638690.2228] device (aax000f79546295): supplicant interface state: inactive -> scanning
Jan 17 07:38:11 comp56 kernel: [  451.709320] aax000f79546295: authenticate with 22:11:bb:ee:77:99
Jan 17 07:38:11 comp56 NetworkManager[447]: <info>  [1484638691.1161] device (aax000f79546295): supplicant interface state: scanning -> authenticating
Jan 17 07:38:11 comp56 kernel: [  451.752904] aax000f79546295: send auth to 22:11:bb:ee:77:99 (try 1/3)
Jan 17 07:38:11 comp56 kernel: [  451.755677] aax000f79546295: authenticated
Jan 17 07:38:16 comp56 kernel: [  456.754570] aax000f79546295: aborting authentication with 22:11:bb:ee:77:99 by local choice (Reason: 3=DEAUTH_LEAVING)
Jan 17 07:38:16 comp56 NetworkManager[447]: <info>  [1484638696.1383] device (aax000f79546295): supplicant interface state: authenticating -> disconnected
Jan 17 07:38:26 comp56 NetworkManager[447]: <info>  [1484638706.1465] device (aax000f79546295): supplicant interface state: disconnected -> scanning
Jan 17 07:38:26 comp56 kernel: [  467.617419] aax000f79546295: authenticate with 22:11:bb:ee:77:99
Jan 17 07:38:27 comp56 NetworkManager[447]: <info>  [1484638707.0208] device (aax000f79546295): supplicant interface state: scanning -> authenticating
Jan 17 07:38:27 comp56 kernel: [  467.657640] aax000f79546295: send auth to 22:11:bb:ee:77:99 (try 1/3)

I used lsusb to find the chipset of the adapter.

In my case it was RTL8188CUS.

Bus 001 Device 004: ID — Realtek Semiconductor Corp. RTL8188CUS 802.11n WLAN Adapter

I found a number of articles on the Internet which reference the GitHub rtl8192cu-fixes chipset driver by pvaret.

Here’s the one which I followed:

https://sites.google.com/site/easylinuxtipsproject/reserve-7
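
The fix those articles describe is to build the rtl8192cu-fixes driver via DKMS. The following is only a rough sketch – the helper package names, the module name/version and the blacklist file name are assumptions from memory, so check the repository’s README before running anything:

# prerequisites for building out-of-tree modules (package names assumed for Debian/Ubuntu)
apt-get install git dkms build-essential linux-headers-$(uname -r)

# fetch the driver and register it with DKMS
git clone https://github.com/pvaret/rtl8192cu-fixes.git
dkms add ./rtl8192cu-fixes

# module name and version come from the repo's dkms.conf and may differ
dkms install 8192cu/1.11
depmod -a

# blacklist the in-kernel driver so the DKMS one is used (file name as per the repo)
cp ./rtl8192cu-fixes/blacklist-native-rtl8192.conf /etc/modprobe.d/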


ERROR 2002 (HY000): Can’t connect to local MySQL server through socket ‘/var/run/mysqld/mysqld.sock’ (2)

On a new Linux installation I was getting an error when logging into MySQL using

mysql -u Username -p

The error was shown as:

ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)

A silly error to encounter. I found that MySQL server was not installed.

Having rebuilt the computer with a replacement hard drive, numerous packages had been installed.

Installing phpMyAdmin, there were errors connecting to MySQL, leading to the discovery that I had yet to install the MySQL server.

I installed MySQL using the command:

apt-get install mysql-server
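
With the package installed the server should be running and the socket file present. A quick check – the service name is mysql on Debian-based systems, but may be mariadb depending on which packages were pulled in:

systemctl status mysql
ls -l /var/run/mysqld/mysqld.sock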

With MySQL installed, phpMyAdmin worked and all was well. The Can’t connect to local MySQL server error was resolved.

Create ISO Disk Image From Files

From a set of files on a computer I wanted to create an ISO disk image.

The ISO image which I had required some changes to a configuration file before I could make use of it.

Having downloaded an ISO disk image, I wished to extract the files, edit one of the configuration files and to then save the set of files once more as an ISO disk image.

Extracting ISO Image

To extract the ISO image to a set of files I used Ark. Ark is able to manage a number of archive formats, including CD-ROM images.

I converted the downloaded ISO DVD image to a set of files by right-clicking on the image and selecting to extract.

After editing the DVD installation’s option files which needed changing, I recreated the DVD image.

Create DVD ISO Image from Files

I chose to use genisoimage to create the ISO disk image from the edited files.

To create an ISO image from the files within a directory is just as simple.

State an output path and name for the ISO to create, along with a source directory. For example (the older command for this job was mkisofs, of which genisoimage is the successor):

genisoimage -o /home/neil/example.iso /home/directory/
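
To keep long file names readable on other systems and give the disk a label, the usual Joliet and Rock Ridge extensions and a volume ID can be added. These particular flags are a suggestion rather than what the original command used:

genisoimage -o /home/neil/example.iso -V "EXAMPLE_DVD" -J -r /home/directory/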

Save a DVD as an ISO File

Whilst investigating the option to extract the ISO image and to subsequently recreate it, I found commentary regarding saving the contents of a physical DVD/CD-ROM as an ISO image.

The command dd can be used for this.

Open a terminal window and type the following at the command line.

dd if=/dev/cdrom of=/directory/example.iso

The above command has the following arguments:

  • dd is the program used to convert and copy a file
  • if defines the name and location of the input file
  • of defines the name and location of the output file
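
On a GNU/Linux system the copy can be sped up and monitored by setting a block size and asking for progress output. The status=progress option needs a reasonably recent GNU coreutils, so treat this as an optional refinement:

dd if=/dev/cdrom of=/directory/example.iso bs=2048 status=progress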

References

genisoimage Debian package reference

genisoimage Debian man page

dd command line utility.

MythWeb Sub-Pages File not Found

Following a recent update to Apache I found that when I navigated from the MythWeb main page to one of the sub-pages, for example, to the television listings

http://192.168.0.1/mythweb/tv/list

I was seeing an empty page with the simple message:

File not found.

I drew the conclusion that, because the main page was functioning, the issue was related to the Apache rewrite rules and the handling of the query string parameters for pages.

Looking at the MythWeb directory structure, these sub-directories and pages don’t exist as files, implying that the sub-pages are served via rewrites and the query string.

I was using an old configuration with a simple reference within the file:

/etc/apache2/sites-available/000-default.conf.

As a part of my changes I removed the directory reference to mythweb within this file and copied the file /etc/mythtv/mythweb.conf across to the directory /etc/apache2/sites-available.

To enable the use of the mythweb.conf file I issued the command

a2ensite mythweb
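
Enabling a site only takes effect once Apache has re-read its configuration, so the usual follow-up on Debian-based systems is:

service apache2 reload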

This first step didn’t resolve the file not found error.

Looking in the file /etc/apache2/sites-available/mythweb.conf:

# If MythWeb is installed outside of the document root (eg. using Alias) then
# you will need to set this directive to the base URL that MythWeb is visible
# from externally.  If you do not, the web server will return 'not found'.
RewriteBase    /mythweb

# Skip out early if we've already been through rewrites,
# or if this is a /css/, /js/ or /cache/ directory request.
RewriteRule    ^(css|data|images|js|themes|skins|README|INSTALL|[a-z_]+\.(php|pl))(/|$)     -     [L]

# Redirect /pl/ requests to the perl cgi handler.
RewriteRule     ^(pl(/.*)?)$            mythweb.pl/$1               [QSA,L]

# Redirect most of the remaining URL requests to the main mythweb script.
# It will then handle any requests given to it.
RewriteRule     ^(.+)$                  mythweb.php/$1              [QSA,L]

# If you're experiencing trouble with the previous two lines in your copy of
# apache, you could instead use something like:
#RewriteRule     ^(pl(/.*)?)$           mythweb.pl?PATH_INFO=/$1    [L,QSA]
#RewriteRule     ^(.+)$                 mythweb.php?PATH_INFO=/$1   [L,QSA]

# Catch anything else that comes through and send it to mythweb.php with no parameters.
RewriteRule     ^(.*)$                  mythweb.php                 [QSA,L]

It has a section suggesting that the default rewrite rules may not work under some circumstances.

To correct the issue I commented out the two lines:

RewriteRule     ^(pl(/.*)?)$            mythweb.pl/$1               [QSA,L]
RewriteRule     ^(.+)$                  mythweb.php/$1              [QSA,L]

and uncommented the two lines

RewriteRule     ^(pl(/.*)?)$           mythweb.pl?PATH_INFO=/$1    [L,QSA]
RewriteRule     ^(.+)$                 mythweb.php?PATH_INFO=/$1   [L,QSA]

to give:

# If MythWeb is installed outside of the document root (eg. using Alias) then
# you will need to set this directive to the base URL that MythWeb is visible
# from externally.  If you do not, the web server will return 'not found'.
RewriteBase    /mythweb

# Skip out early if we've already been through rewrites,
# or if this is a /css/, /js/ or /cache/ directory request.
RewriteRule    ^(css|data|images|js|themes|skins|README|INSTALL|[a-z_]+\.(php|pl))(/|$)     -     [L]

# Redirect /pl/ requests to the perl cgi handler.
#RewriteRule     ^(pl(/.*)?)$            mythweb.pl/$1               [QSA,L]

# Redirect most of the remaining URL requests to the main mythweb script.
# It will then handle any requests given to it.
#RewriteRule     ^(.+)$                  mythweb.php/$1              [QSA,L]

# If you're experiencing trouble with the previous two lines in your copy of
# apache, you could instead use something like:
RewriteRule     ^(pl(/.*)?)$           mythweb.pl?PATH_INFO=/$1    [L,QSA]
RewriteRule     ^(.+)$                 mythweb.php?PATH_INFO=/$1   [L,QSA]

# Catch anything else that comes through and send it to mythweb.php with no parameters.
RewriteRule     ^(.*)$                  mythweb.php                 [QSA,L]

I saved the above file and reloaded Apache using

service apache2 reload
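
Before reloading it’s worth checking the configuration for syntax errors; this assumes the standard Debian Apache control script is available:

apache2ctl configtest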

Following the change to the file mythweb.conf and the reload of Apache, I was once more able to navigate around the MythWeb sub-pages.

KDE Add Extract in Right Click Context Menu

How to add the missing Extract option to the KDE right click context menu.

On a recent Debian installation with KDE, right-clicking on a zip file didn’t show the Extract here option:

KDE Dolphin right click context menu without extract

The above image shows the pop-up context menu when right clicking on the zip file.

As can be seen in the above image there is no option to extract the contents of the zip file.

I chose to install the packages zip and unzip

apt-get install zip unzip

But there was no change in the context menu.

With the addition of the ark archive utility package

apt-get install ark

The context-sensitive right-click option to uncompress a zip file was then shown:

KDE Dolphin right click context menu with extract

As can be seen in the image above, following the addition of ark, the right click context menu now shows the Extract option.