Densely Connected Networks


In [0]:
import keras
keras.__version__
/usr/local/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
Using TensorFlow backend.
Out[0]:
'2.1.3'

Preloading the data in Keras

In [0]:
from keras.datasets import mnist

# load the train and test data
(x_train, y_train), (x_test, y_test) = mnist.load_data()
In [0]:
print(x_train.ndim) 
3
In [0]:
print(x_train.shape)
(60000, 28, 28)
In [0]:
print(x_train.dtype)
uint8
In [0]:
len(y_train)
Out[0]:
60000
In [0]:
import matplotlib.pyplot as plt
plt.imshow(x_train[8], cmap=plt.cm.binary)
print(y_train[8])
1
In [0]:
import numpy
from numpy import linalg
numpy.set_printoptions(precision=2, suppress=True, linewidth=120)
print(numpy.matrix(x_train[8]))
[[  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   5  63 197   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  20 254 230  24   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  20 254 254  48   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  20 254 255  48   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  20 254 254  57   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  20 254 254 108   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  16 239 254 143   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0 178 254 143   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0 178 254 143   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0 178 254 162   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0 178 254 240   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0 113 254 240   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0  83 254 245  31   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0  79 254 246  38   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0 214 254 150   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0 144 241   8   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0 144 240   2   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0 144 254  82   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0 230 247  40   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0 168 209  31   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]]
In [0]:
plt.imshow(x_train[9], cmap=plt.cm.binary)
print(y_train[9])
4
In [0]:
plt.imshow(x_train[10], cmap=plt.cm.binary)
print(y_train[10])
3
In [0]:
plt.imshow(x_test[11], cmap=plt.cm.binary)
print(y_test[11])
6
In [0]:
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')

x_train /= 255
x_test /= 255
In [0]:
print(numpy.matrix(x_train[8]))
[[0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.02 0.25 0.77 0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.08 1.   0.9  0.09 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.08 1.   1.   0.19 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.08 1.   1.   0.19 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.08 1.   1.   0.22 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.08 1.   1.   0.42 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.06 0.94 1.   0.56 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.7  1.   0.56 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.7  1.   0.56 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.7  1.   0.64 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.7  1.   0.94 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.44 1.   0.94 0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.33 1.   0.96 0.12 0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.31 1.   0.96 0.15 0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.84 1.   0.59 0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.56 0.95 0.03 0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.56 0.94 0.01 0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.56 1.   0.32 0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.9  0.97 0.16 0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.66 0.82 0.12 0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]
 [0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.
  0.   0.   0.   0.   0.  ]]
In [0]:
x_train = x_train.reshape(60000, 784)
x_test = x_test.reshape(10000, 784)


print(x_train.shape)
print(x_test.shape)
(60000, 784)
(10000, 784)
In [0]:
print(y_test[0])
7
In [0]:
print(y_train[0])
5
In [0]:
print(y_train.shape)
(60000,)
In [0]:
print(x_test.shape)
(10000, 784)
In [0]:
from keras.utils import to_categorical

y_train = to_categorical(y_train, num_classes=10)
y_test = to_categorical(y_test, num_classes=10)
In [0]:
print(y_test[0])
[0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
In [0]:
print(y_train[0])
[0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
In [0]:
print(y_train.shape)
(60000, 10)
In [0]:
print(y_test.shape)
(10000, 10)

Model

In [0]:
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

model = Sequential()
model.add(Dense(10, activation='sigmoid', input_shape=(784,)))
model.add(Dense(10, activation='softmax'))

model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 10)                7850      
_________________________________________________________________
dense_2 (Dense)              (None, 10)                110       
=================================================================
Total params: 7,960
Trainable params: 7,960
Non-trainable params: 0
_________________________________________________________________
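
The parameter counts in the summary come straight from the layer sizes: the first Dense layer has 784 × 10 weights plus 10 biases (7,850), and the second has 10 × 10 weights plus 10 biases (110), for 7,960 in total. A quick sanity check against the summary above:

# Verify the parameter counts reported by model.summary()
print(784 * 10 + 10)         # 7850, first Dense layer
print(10 * 10 + 10)          # 110, second Dense layer
print(model.count_params())  # 7960 in total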

Definition, training and evaluation

In [0]:
batch_size = 50
num_classes = 10
epochs=10

model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])

model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs,
          verbose=1
          )

test_loss, test_acc = model.evaluate(x_test, y_test)

print('Test loss:', test_loss)
print('Test accuracy:', test_acc)
Epoch 1/10
60000/60000 [==============================] - 1s 23us/step - loss: 2.0126 - acc: 0.4508
Epoch 2/10
60000/60000 [==============================] - 1s 19us/step - loss: 1.5535 - acc: 0.6985
Epoch 3/10
60000/60000 [==============================] - 1s 19us/step - loss: 1.2442 - acc: 0.7568
Epoch 4/10
60000/60000 [==============================] - 1s 19us/step - loss: 1.0335 - acc: 0.7935
Epoch 5/10
60000/60000 [==============================] - 1s 20us/step - loss: 0.8872 - acc: 0.8200
Epoch 6/10
60000/60000 [==============================] - 1s 20us/step - loss: 0.7813 - acc: 0.8393
Epoch 7/10
60000/60000 [==============================] - 1s 20us/step - loss: 0.7016 - acc: 0.8540
Epoch 8/10
60000/60000 [==============================] - 1s 20us/step - loss: 0.6402 - acc: 0.8635
Epoch 9/10
60000/60000 [==============================] - 1s 21us/step - loss: 0.5921 - acc: 0.8701
Epoch 10/10
60000/60000 [==============================] - 1s 19us/step - loss: 0.5538 - acc: 0.8749
10000/10000 [==============================] - 0s 15us/step
Test loss: 0.5261379142284394
Test accuracy: 0.8839

Predictions

In [0]:
predictions = model.predict(x_test)
In [0]:
print(predictions[11])
[0.03 0.12 0.2  0.03 0.02 0.05 0.42 0.   0.12 0.01]
In [0]:
numpy.sum(predictions[11])
Out[0]:
1.0000001
In [0]:
numpy.argmax(predictions[11])
Out[0]:
6
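
Because y_test is one-hot encoded, the same argmax trick recovers the true labels, which gives a quick way to recompute the overall test accuracy reported by model.evaluate above (a small sketch using the arrays already defined):

# Recompute test accuracy directly from the predictions
predicted_classes = numpy.argmax(predictions, axis=1)
true_classes = numpy.argmax(y_test, axis=1)
print(numpy.mean(predicted_classes == true_classes))  # should match the ~0.88 accuracy above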
In [0]:
# Look at the confusion matrix
# Note: this code is adapted from the scikit-learn documentation; a nice way of viewing the confusion matrix.
def plot_confusion_matrix(cm, classes,
                          normalize=False,
                          title='Confusion matrix',
                          cmap=plt.cm.Blues):
    """
    This function prints and plots the confusion matrix.
    Normalization can be applied by setting `normalize=True`.
    """
    if normalize:
        # normalise each row before plotting so the colours reflect proportions
        cm = cm.astype('float') / cm.sum(axis=1)[:, numpy.newaxis]

    plt.imshow(cm, interpolation='nearest', cmap=cmap)
    plt.title(title)
    plt.colorbar()
    tick_marks = numpy.arange(len(classes))
    plt.xticks(tick_marks, classes, rotation=45)
    plt.yticks(tick_marks, classes)

    fmt = '.2f' if normalize else 'd'
    thresh = cm.max() / 2.
    for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
        plt.text(j, i, format(cm[i, j], fmt),
                 horizontalalignment="center",
                 color="white" if cm[i, j] > thresh else "black")

    plt.tight_layout()
    plt.ylabel('Actual class')
    plt.xlabel('Predicted class')
In [0]:
from collections import Counter
from sklearn.metrics import confusion_matrix
import itertools

# Predict class probabilities for the test dataset
Y_pred = model.predict(x_test)
# Convert the predicted probabilities to class labels
Y_pred_classes = numpy.argmax(Y_pred, axis=1)
# Convert the one-hot test labels back to class labels
Y_true = numpy.argmax(y_test, axis=1)
# compute the confusion matrix
confusion_mtx = confusion_matrix(Y_true, Y_pred_classes) 
# plot the confusion matrix
plot_confusion_matrix(confusion_mtx, classes = range(10))
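
The same helper can be called with normalize=True to rescale each row to proportions, which makes it easier to compare digits that occur with different frequencies (a usage sketch of the function defined above):

# Row-normalised confusion matrix: each row sums to 1
plot_confusion_matrix(confusion_mtx, classes=range(10), normalize=True,
                      title='Normalized confusion matrix')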

SEO Language Tags

Language Meta Tag

<head>
<meta http-equiv="content-language" content="en-gb" />
</head>

Language HTTP Response Header

Content-Language

HTTP/1.1 200 OK
Content-Language: en-gb

Link

Link: <http://praison.com/file.pdf>; rel="alternate"; hreflang="en",
<http://de-ch.praison.com/file.pdf>; rel="alternate"; hreflang="de-ch",
<http://de.praison.com/file.pdf>; rel="alternate"; hreflang="de"

Language <html> Tag Attribute

<html lang="en-gb">
...
</html>

Language Hreflang Tag

<link rel="alternate" hreflang="en-gb" href="https://praison.com/en-gb/" />

XML Sitemap Tag

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:xhtml="http://www.w3.org/1999/xhtml">
<url>
<loc>https://prasion.com/english/page.html</loc>
<xhtml:link 
rel="alternate"
hreflang="de"
href="https://prasion.com/deutsch/page.html"/>
<xhtml:link 
rel="alternate"
hreflang="de-ch"
href="https://prasion.com/schweiz-deutsch/page.html"/>
<xhtml:link 
rel="alternate"
hreflang="en"
href="https://prasion.com/english/page.html"/>
</url>
<url>
<loc>https://prasion.com/deutsch/page.html</loc>
<xhtml:link 
rel="alternate"
hreflang="de"
href="https://prasion.com/deutsch/page.html"/>
<xhtml:link 
rel="alternate"
hreflang="de-ch"
href="https://prasion.com/schweiz-deutsch/page.html"/>
<xhtml:link 
rel="alternate"
hreflang="en"
href="https://prasion.com/english/page.html"/>
</url>
<url>
<loc>https://prasion.com/schweiz-deutsch/page.html</loc>
<xhtml:link 
rel="alternate"
hreflang="de"
href="https://prasion.com/deutsch/page.html"/>
<xhtml:link 
rel="alternate"
hreflang="de-ch"
href="https://prasion.com/schweiz-deutsch/page.html"/>
<xhtml:link 
rel="alternate"
hreflang="en"
href="https://prasion.com/english/page.html"/>
</url>
</urlset>

 

Selenium Firefox Webdriver Python Setup

Python Code

#Packages

from bs4 import BeautifulSoup
import requests
import pandas as pd
import numpy as np
import csv
import re
from selenium import webdriver


#driver = webdriver.Firefox(capabilities={"marionette":False})
caps = webdriver.DesiredCapabilities.FIREFOX
caps["marionette"] = False
driver = webdriver.Firefox(capabilities=caps)


driver.get("https://www.google.com")

print(driver.title)

Python Code with Headless Firefox

 

#Packages

from bs4 import BeautifulSoup
import requests
import pandas as pd
import numpy as np
import csv
import re
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
options = Options()
options.add_argument("--headless")


#driver = webdriver.Firefox(capabilities={"marionette":False})
caps = webdriver.DesiredCapabilities.FIREFOX
caps["marionette"] = False
driver = webdriver.Firefox(capabilities=caps, firefox_options=options)


driver.get("https://www.google.com")

print(driver.title)

WebDriverException: Message: Can’t load the profile

If Firefox fails to start with this error, try enabling Marionette instead:

caps["marionette"] = True




Get all Divs using Selenium Driver Python

X Path

divs = driver.find_elements_by_xpath('//li/div')

CSS Selector

divs = driver.find_elements_by_css_selector('li > div')
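
Either selector returns a list of WebElement objects, which can then be iterated to inspect text or attributes (a small sketch reusing the divs list from either call above):

# Print the visible text of each matched div
for div in divs:
    print(div.text)

print(len(divs))  # number of matching divs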

Add Bootstrap to WordPress admin sub-menu page

Add Bootstrap to a plugin page

// custom css and js
add_action('admin_enqueue_scripts', 'cstm_css_and_js');

function cstm_css_and_js($hook) {
    // your-slug => the slug name passed to "add_submenu_page"
    // tools_page => refers to the Tools top menu, so this is a Tools sub-menu page
    if ( 'tools_page_your-slug' != $hook ) {
        return;
    }

    wp_enqueue_style('boot_css', plugins_url('inc/bootstrap.css', __FILE__));
    wp_enqueue_script('boot_js', plugins_url('inc/bootstrap.js', __FILE__));
}

Synchronising between Git and SVN

How to synchronise updates between a GitHub repository and the WordPress.org plugin Subversion repository.

1. Clone the GitHub Repo

SSH

$ git clone git@github.com:Praison/seo-wordpress.git

HTTPS

$ git clone https://github.com/Praison/seo-wordpress.git

2. Change into the Directory

$ cd seo-wordpress

3. Set Up a Subversion tracking branch

$ git branch --no-track svnsync 
$ git checkout svnsync
$ git svn init -s https://plugins.svn.wordpress.org/seo-wordpress/ --prefix=origin/
$ git svn fetch --log-window-size 10000 #CAUTION THIS LINE TAKES A LONG TIME TO COMPLETE
$ git reset --hard origin/trunk

4. Merge changes from Subversion to GitHub

$ git checkout svnsync
$ git svn rebase 
$ git checkout master 
$ git merge svnsync 
$ git push origin master

5. Merge changes from GitHub and publish to SubVersion

$ git checkout master
$ git pull origin master 
$ git checkout svnsync 
$ git svn rebase 
$ git merge --no-ff master 
$ git commit 
$ git svn dcommit

Tagging Releases

Tagging a release in Git is very simple:

$ git tag v1.0.4

To create an SVN tag, simply:

$ git svn tag 1.0.4

This will create `/tags/1.0.4` in the remote SVN repository and copy all the files from the remote `/trunk` into that tag, so be sure to push all the latest code to `/trunk` before creating an SVN tag.

Subversion Tagging

It appears that there is now an issue with git svn tag, so tags now have to be created with Subversion directly.

Check out the code from the SVN repo, then copy trunk to the new tag:

$ svn checkout https://plugins.svn.wordpress.org/stop-web-crawlers/
$ svn cp https://plugins.svn.wordpress.org/stop-web-crawlers/trunk https://plugins.svn.wordpress.org/stop-web-crawlers/tags/1.3.1