Hands-on Scikit-Learn for Machine Learning Applications: Data Science Fundamentals with Python
David Paper
Trademarked names, logos, and images may appear in this book. Rather
than use a trademark symbol with every occurrence of a trademarked
name, logo, or image, we use the names, logos, and images only in an
editorial fashion and to the benefit of the trademark owner, with no
intention of infringement of the trademark. The use in this publication
of trade names, trademarks, service marks, and similar terms, even if
they are not identified as such, is not to be taken as an expression of
opinion as to whether or not they are subject to proprietary rights.
While the advice and information in this book are believed to be true
and accurate at the date of publication, neither the authors nor the
editors nor the publisher can accept any legal responsibility for any
errors or omissions that may be made. The publisher makes no
warranty, express or implied, with respect to the material contained
herein.
1. Introduction to Scikit-Learn
David Paper
Scikit-Learn is a Python library that provides simple and efficient tools for implementing
supervised and unsupervised machine learning algorithms. The library is accessible to everyone
because it is open source and commercially usable. It is built on the NumPy, SciPy, and Matplotlib
libraries, which means it is reliable, robust, and well integrated with the core scientific Python ecosystem.
Scikit-Learn is focused on data modeling rather than data loading, cleansing, munging, or
manipulating. It is also very easy to use and relatively free of programming bugs.
Machine Learning
Machine learning is getting computers to program themselves. We use algorithms to make this
happen. An algorithm is a set of rules a computer follows to perform a calculation or solve a problem.
Machine learning advocates create, study, and apply algorithms to improve performance on
data-driven tasks. They use tools and technology to answer questions about data by training a
machine how to learn.
The goal is to build robust algorithms that can manipulate input data to predict an output
while continually updating outputs as new data becomes available. Any information or data sent
to a computer is considered input. Data produced by a computer is considered output.
In the machine learning community, input data is referred to as the feature set and output data
is referred to as the target. The feature set is also referred to as the feature space. Sample data is
typically referred to as training data. Once the algorithm is trained with sample data, it can make
predictions on new data. New data is typically referred to as test data.
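To make these terms concrete, here is a minimal sketch (not from the book) that splits the Iris feature set and target into training and test data with Scikit-Learn's train_test_split; the one-third test size and random_state=0 are illustrative choices only.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# feature set (X) and target (y)
X, y = load_iris(return_X_y=True)

# hold out one third of the samples as test data (illustrative split)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0)

print('training samples:', X_train.shape[0])
print('test samples:', X_test.shape[0])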
Machine learning is divided into two main areas: supervised and unsupervised learning. Since
machine learning typically focuses on prediction based on known properties learned from
training data, our focus is on supervised learning.
Supervised learning is when the data set contains both inputs (or the feature set) and desired
outputs (or targets). That is, we know the properties of the data. The goal is to make predictions.
This ability to supervise algorithm training is a big part of why machine learning has become so
popular.
To classify or regress new data, we must train on data with known outcomes. We classify data
by organizing it into relevant categories. We regress data by finding the relationship between
feature set data and target data.
With unsupervised learning, the data set contains only inputs but no desired outputs (or
targets). The goal is to explore the data and find some structure or way to organize it. Although
not the focus of the book, we will explore a few unsupervised learning scenarios.
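As a small taste of unsupervised learning (a sketch, not an example from the book), KMeans clustering groups the Iris feature set into three clusters without ever seeing the targets; the choice of three clusters is an assumption made for illustration.

from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

# unsupervised learning: only the feature set is used, no targets
X, _ = load_iris(return_X_y=True)

kmeans = KMeans(n_clusters=3, random_state=0, n_init=10)
labels = kmeans.fit_predict(X)   # discovered group for each sample
print(labels[:10])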
Anaconda
You can use any Python installation, but I recommend installing Python with Anaconda for several
reasons. First, it has over 15 million users. Second, Anaconda allows easy installation of the
desired version of Python. Third, it preinstalls many useful libraries for machine learning
including Scikit-Learn. Follow this link to see the Anaconda package lists for your operating
system and Python version: https://docs.anaconda.com/anaconda/packages/pkg-
docs/. Fourth, it includes several very popular editors including IDLE, Spyder, and Jupyter
Notebooks. Fifth, Anaconda is reliable and well-maintained and removes compatibility
bottlenecks.
You can easily download and install Anaconda with this link:
https://www.anaconda.com/download/. You can update with this link:
https://docs.anaconda.com/anaconda/install/update-version/. Just open
Anaconda and follow instructions. I recommend updating to the current version.
Scikit-Learn
Python’s Scikit-Learn is one of the most popular machine learning libraries. It is built on Python
libraries NumPy, SciPy, and Matplotlib. The library is well-documented, open source,
commercially usable, and a great vehicle to get started with machine learning. It is also very
reliable and well-maintained, and its vast collection of algorithms can be easily incorporated into
your projects. Scikit-Learn is focused on modeling data rather than loading, manipulating,
visualizing, and summarizing data. For such activities, other libraries such as NumPy, pandas,
Matplotlib, and seaborn are covered as encountered. The Scikit-Learn library is imported into a
Python script as sklearn.
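As a quick sanity check (a suggestion, not from the book), you can confirm that Scikit-Learn is available in your Python environment and see which version is installed:

import sklearn

# print the installed Scikit-Learn version to confirm the library is available
print(sklearn.__version__)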
Data Sets
A great way to understand machine learning applications is by working through Python data-
driven code examples. We use Scikit-Learn, UCI Machine Learning Repository, or seaborn data sets for
all examples. The Scikit-Learn datasets package embeds some small data sets for getting started
and provides helpers to fetch larger data sets that the machine learning community commonly uses
to benchmark algorithms on real-world data. The UCI Machine Learning Repository maintains 468
data sets to serve the machine learning community. Seaborn provides an API on top of Matplotlib
that offers simplicity when working with plot styles, color defaults, and high-level functions for
common statistical plot types that facilitate visualization. It also integrates nicely with Pandas
DataFrame functionality.
We chose the data sets for our examples because the machine learning community uses them
for learning, exploring, benchmarking, and validating, so we can compare our results to others
while learning how to apply machine learning algorithms.
Our data sets are categorized as either classification or regression data. Classification data
complexity ranges from simple to relatively complex. Simple classification data sets include
load_iris, load_wine, bank.csv, and load_digits. Complex classification data sets include
fetch_20newsgroups, MNIST, and fetch_lfw_people. Regression data sets include tips, redwine.csv,
whitewine.csv, and load_boston.
Characterize Data
Before working with algorithms, it is best to understand the data characterization. Each data set
was carefully chosen to help you gain experience with the most common aspects of machine
learning. We begin by describing the characteristics of each data set to better understand its
composition and purpose. Data sets are organized by classification and regression data.
Classification data is further organized by complexity. That is, we begin with simple
classification data sets that are not complex so that the reader can focus on the machine learning
content rather than on the data. We then move onto more complex data sets.
Iris Data
The first data set we characterize is load_iris, which consists of Iris flower data. Iris is a
multivariate data set consisting of 50 samples from each of three species of iris (Iris setosa, Iris
virginica, and Iris versicolor). Each sample contains four features, namely, length and width of
sepals and petals in centimeters. Iris is a typical test case for machine learning classification. It is
also one of the best known data sets in the data science literature, which means you can test your
results against many other verifiable examples.
The first code example shown in Listing 1-1 loads Iris data, displays its keys, shape of the
feature set and target, feature and target names, a slice from the DESCR key, and feature
importance (from most to least).
from sklearn import datasets
from sklearn.ensemble import RandomForestClassifier

if __name__ == "__main__":
    br = '\n'
    iris = datasets.load_iris()
    keys = iris.keys()
    print (keys, br)
    X = iris.data
    y = iris.target
    print ('features shape:', X.shape)
    print ('target shape:', y.shape, br)
    features = iris.feature_names
    targets = iris.target_names
    print ('feature set:')
    print (features, br)
    print ('targets:')
    print (targets, br)
    print (iris.DESCR[525:900], br)
    rnd_clf = RandomForestClassifier(random_state=0,
                                     n_estimators=100)
    rnd_clf.fit(X, y)
    rnd_name = rnd_clf.__class__.__name__
    feature_importances = rnd_clf.feature_importances_
    importance = sorted(zip(feature_importances, features),
                        reverse=True)
    print ('most important features' + ' (' + rnd_name + '):')
    [print (row) for i, row in enumerate(importance)]
Listing 1-1 Characterize the Iris data set
Go ahead and execute the code from Listing 1-1. Remember that you can find the example
from the book’s example download. You don’t need to type the example by hand. It’s easier to
access the example download and copy/paste.
Your output from executing Listing 1-1 should resemble the following:
feature set:
['sepal length (cm)', 'sepal width (cm)', 'petal length (cm)', 'petal
width (cm)']
targets:
['setosa' 'versicolor' 'virginica']
Tip RandomForestClassifier is a powerful machine learning algorithm that not only models
training data, but also returns feature importance.
Wine Data
The next data set we characterize is load_wine. The load_wine data set consists of 178 data
elements. Each element has thirteen features that describe three target classes. It is considered a
classic in the machine learning community and offers an easy multi-classification data set.
The next code example shown in Listing 1-2 loads wine data and displays its keys, shape of the
feature set and target, feature and target names, a slice from the DESCR key, and feature
importance (from most to least).
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier

if __name__ == "__main__":
    br = '\n'
    data = load_wine()
    keys = data.keys()
    print (keys, br)
    X, y = data.data, data.target
    print ('features:', X.shape)
    print ('targets', y.shape, br)
    print (X[0], br)
    features = data.feature_names
    targets = data.target_names
    print ('feature set:')
    print (features, br)
    print ('targets:')
    print (targets, br)
    rnd_clf = RandomForestClassifier(random_state=0,
                                     n_estimators=100)
    rnd_clf.fit(X, y)
    rnd_name = rnd_clf.__class__.__name__
    feature_importances = rnd_clf.feature_importances_
    importance = sorted(zip(feature_importances, features),
                        reverse=True)
    n = 6
    print (n, 'most important features' + ' (' + rnd_name + '):')
    [print (row) for i, row in enumerate(importance) if i < n]
Listing 1-2 Characterize load_wine
After executing code from Listing 1-2, your output should resemble the following:
feature set:
['alcohol', 'malic_acid', 'ash', 'alcalinity_of_ash', 'magnesium',
'total_phenols', 'flavanoids', 'nonflavanoid_phenols',
'proanthocyanins', 'color_intensity', 'hue',
'od280/od315_of_diluted_wines', 'proline']
targets:
['class_0' 'class_1' 'class_2']
Tip To create (instantiate) a machine learning algorithm (model), just assign it to a variable
(e.g., model = algorithm()). To train based on the model, just fit it to the data (e.g., model.fit(X,
y)).
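A minimal sketch of that pattern (illustrative only, reusing the wine data just loaded; the names model and predictions are our own):

from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier

X, y = load_wine(return_X_y=True)

model = RandomForestClassifier(random_state=0, n_estimators=100)  # instantiate
model.fit(X, y)                                                   # train on the data
predictions = model.predict(X[:5])                                # predict on new samples
print(predictions)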
The code begins by importing load_wine and RandomForestClassifier. The main block displays
keys, loads data into X and y, displays the first vector from feature set X, displays shapes, and
displays feature set and target information. The code concludes by training X with
RandomForestClassifier, so we can display the six most important features. Notice that we display
the first vector from feature set X to verify that all features are numeric.
Bank Data
The next code example shown in Listing 1-3 works with bank data. The bank.csv data set is
composed of direct marketing campaigns from a Portuguese banking institution. The target is
described by whether a client will subscribe (yes/no) to a term deposit (target label y). It consists
of 41188 data elements with 20 features for each element. A 10% random sample of 4119 data
elements is also available from the UCI Machine Learning Repository for more computationally
expensive algorithms such as svm and KNeighborsClassifier.
import pandas as pd

if __name__ == "__main__":
    br = '\n'
    f = 'data/bank.csv'
    bank = pd.read_csv(f)
    features = list(bank)
    print (features, br)
    X = bank.drop(['y'], axis=1).values
    y = bank['y'].values
    print (X.shape, y.shape, br)
    print (bank[['job', 'education', 'age', 'housing',
                 'marital', 'duration']].head())
Listing 1-3 Characterize bank data
After executing code from Listing 1-3, your output should resemble the following:
The code example begins by importing the pandas package. The main block loads bank data
from a CSV file into a Pandas DataFrame and displays the column names (or features). To retrieve
column names from pandas, all we need to do is make the DataFrame a list and assign the result
to a variable. Next, feature set X and target y are created. Finally, X and y shapes are displayed as
well as a few choice features.
Digits Data
The final code example in this subsection is load_digits. The load_digits data set consists of 1797 8
× 8 handwritten images. Each image is represented by 64 pixels (based on an 8 × 8 matrix), which
make up the feature set. The ten targets we predict are represented by the digits zero through nine.
Listing 1-4 contains the code that characterizes load_digits.
import numpy as np
from sklearn.datasets import load_digits
import matplotlib.pyplot as plt
if __name__ == "__main__":
    br = '\n'
    digits = load_digits()
    print (digits.keys(), br)
    print ('2D shape of digits data:', digits.images.shape, br)
    X = digits.data
    y = digits.target
    print ('X shape (8x8 flattened to 64 pixels):', end=' ')
    print (X.shape)
    print ('y shape:', end=' ')
    print (y.shape, br)
    i = 500
    print ('vector (flattened matrix) of "feature" image:')
    print (X[i], br)
    print ('matrix (transformed vector) of a "feature" image:')
    X_i = np.array(X[i]).reshape(8, 8)
    print (X_i, br)
    print ('target:', y[i], br)
    print ('original "digits" image matrix:')
    print (digits.images[i])
    plt.figure(1, figsize=(3, 3))
    plt.title('reshaped flattened vector')
    plt.imshow(X_i, cmap="gray", interpolation="gaussian")
    plt.figure(2, figsize=(3, 3))
    plt.title('original images dataset')
    plt.imshow(digits.images[i], cmap="gray",
               interpolation='gaussian')
    plt.show()
Listing 1-4 Characterize load_digits
After executing code from Listing 1-4, your output should resemble the following:
target: 8
The code begins by importing numpy, load_digits, and matplotlib packages. The main block
places load_digits into the digits variable and displays its keys: data, target, target_names, images,
and DESCR. It continues by displaying the two-dimensional (2D) shape of images contained in
images. Data in images are represented by 1797 8 × 8 matrices. Next, feature data (represented as
vectors) are placed in X and target data in y.
A feature vector is one that contains information about an object’s important characteristics.
Data in data are represented by 1797 64-pixel feature vectors. A simple feature representation of
an image is the raw intensity value of each pixel. So, an 8 × 8 image is represented by 64 pixels.
Machine learning algorithms process feature data as vectors, so each element in data must be a
one-dimensional (1D) vector representation of its 2D image matrix.
Tip Feature data must be composed of vectors to work with machine learning algorithms.
The code continues by displaying the feature vector of the 500th image. Next, the 500th feature
vector is transformed from its flattened 1D vector shape into a 2D image matrix and displayed
with the NumPy reshape function. The code continues by displaying the target value y of the
500th image. Next, the 500th image matrix is displayed by referencing images.
The reason we transformed the image from its 1D flattened vector state to the 2D image
matrix is that most data sets don't include an images object like load_digits does. So, to visualize and
process data with machine learning algorithms, we must be able to manually flatten images and
transform flattened images back to their original 2D matrix shape.
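As a minimal sketch of that round trip (NumPy only, with a toy 8 × 8 array standing in for one digits image):

import numpy as np

# a toy 8 x 8 "image" matrix standing in for one digits image
image = np.arange(64).reshape(8, 8)

# flatten the 2D matrix into a 1D feature vector for machine learning
vector = image.ravel()
print(vector.shape)   # (64,)

# transform the flattened vector back into its original 2D matrix shape
restored = vector.reshape(8, 8)
print(np.array_equal(image, restored))   # True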
The code concludes by visualizing the 500th image in two ways. First, we use X_i, the flattened
vector transformed back into an 8 × 8 matrix. Second, we reference images. While machine learning
algorithms require feature vectors, the imshow function requires 2D image matrices for visualization.
Newsgroup Data
The first complex classification data set we characterize is fetch_20newsgroups, which consists of approximately 18000
posts on 20 topics. Data is split into train-test subsets. The split is based on messages posted
before and after a specific date.
Listing 1-5 contains the code that characterizes fetch_20newsgroups.
from sklearn.datasets import fetch_20newsgroups

if __name__ == "__main__":
    br = '\n'
    train = fetch_20newsgroups(subset='train')
    test = fetch_20newsgroups(subset='test')
    print ('data:')
    print (train.target.shape, 'shape of train data')
    print (test.target.shape, 'shape of test data', br)
    targets = test.target_names
    print (targets, br)
    categories = ['rec.autos', 'rec.motorcycles', 'sci.space',
                  'sci.med']
    train = fetch_20newsgroups(subset='train',
                               categories=categories)
    test = fetch_20newsgroups(subset='test',
                              categories=categories)
    print ('data subset:')
    print (train.target.shape, 'shape of train data')
    print (test.target.shape, 'shape of test data', br)
    targets = train.target_names
    print (targets)
Listing 1-5 Characterize fetch_20newsgroups
After executing code from Listing 1-5, your output should resemble the following:
data:
(11314,) shape of train data
(7532,) shape of test data
data subset:
(2379,) shape of train data
(1584,) shape of test data
The code begins by importing fetch_20newsgroups. The main block begins by loading train
and test data and displaying their shapes. Training data consists of 11314 postings, while test
data consists of 7532 postings. The code continues by displaying target names and categories.
Next, train and test data are created from a subset of categories. The code concludes by displaying
shapes and target names of the subset.
MNIST Data
The next data set we characterize is MNIST. MNIST (Modified National Institute of Standards and
Technology) is a large database of handwritten digits commonly used for training and testing in
the machine learning community and other industrial image processing applications. MNIST
contains 70000 examples of handwritten digit images labeled from 0 to 9 of size 28 × 28. Each
target (or label) is stored as a digit value. The feature set is a matrix of 70000 28 × 28 images
automatically flattened to 784 pixels each. So, each of the 70000 data elements is a vector of
length 784. The target set is a vector of 70000 digit values.
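Listing 1-6 loads MNIST from local NumPy files in a data directory supplied with the book's download. As a hedged alternative (not from the book), you could build similar files yourself with Scikit-Learn's fetch_openml, assuming a reasonably recent Scikit-Learn version (0.24 or later for the as_frame parameter) and an existing data directory:

from sklearn.datasets import fetch_openml
import numpy as np

# download MNIST (70000 flattened 784-pixel images) from OpenML
mnist = fetch_openml('mnist_784', version=1, as_frame=False)

X = mnist.data.astype(np.float64)      # shape (70000, 784)
y = mnist.target.astype(np.float64)    # labels '0'..'9' converted to floats

# save arrays under the file names Listing 1-6 expects (assumed layout)
np.save('data/X_mnist.npy', X)
np.save('data/y_mnist.npy', y)
np.save('data/mnist_targets.npy', np.unique(y))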
Listing 1-6 contains the code that characterizes MNIST.
import numpy as np
from random import randint
import matplotlib.pyplot as plt

def find_image(data, labels, d):
    # return target value and feature vector of the first image labeled d
    for i, row in enumerate(data):
        if labels[i] == d:
            return labels[i], np.array(row)

if __name__ == "__main__":
    br = '\n'
    X = np.load('data/X_mnist.npy')
    y = np.load('data/y_mnist.npy')
    target = np.load('data/mnist_targets.npy')
    print ('labels (targets):')
    print (target, br)
    print ('feature set shape:')
    print (X.shape, br)
    print ('target set shape:')
    print (y.shape, br)
    indx = randint(0, y.shape[0]-1)
    target = y[indx]
    X_pixels = np.array(X[indx])
    print ('the feature image consists of', len(X_pixels),
           'pixels')
    X_image = X_pixels.reshape(28, 28)
    plt.figure(1, figsize=(3, 3))
    title = 'image @ indx ' + str(indx) + ' is digit ' \
            + str(int(target))
    plt.title(title)
    plt.imshow(X_image, cmap="gray")
    digit = 7
    target, X_pixels = find_image(X, y, digit)
    X_image = X_pixels.reshape(28, 28)
    plt.figure(2, figsize=(3, 3))
    title = 'find first ' + str(int(target)) + ' in dataset'
    plt.title(title)
    plt.imshow(X_image, cmap="gray")
    plt.show()
Listing 1-6 Characterize MNIST
After executing code from Listing 1-6, your output should resemble the following:
labels (targets):
[0. 1. 2. 3. 4. 5. 6. 7. 8. 9.]
Listing 1-6 also displays Figures 1-3 and 1-4. Figure 1-3 is the reshaped image of digit 1 at
index 6969. Figure 1-4 is the first image of digit 7 in the data set.
Figure 1-3 Reshaped flattened vector of image at index 6969
Figure 1-4 First image of digit 7 in the data set
Faces Data
The final complex classification data set we characterize is fetch_lfw_people (Labeled Faces in the
Wild), a collection of labeled images of famous people's faces. As with MNIST, the data has been
saved to NumPy files in the data directory.
Listing 1-7 contains the code that characterizes fetch_lfw_people.
import numpy as np
import matplotlib.pyplot as plt

if __name__ == "__main__":
    br = '\n'
    X = np.load('data/X_faces.npy')
    y = np.load('data/y_faces.npy')
    targets = np.load('data/faces_targets.npy')
    print ('shape of feature and target data:')
    print (X.shape)
    print (y.shape, br)
    print ('target faces:')
    print (targets)
    X_i = np.array(X[0]).reshape(50, 37)
    image_name = targets[y[0]]
    fig, ax = plt.subplots()
    image = ax.imshow(X_i, cmap="bone")
    plt.title(image_name)
    plt.show()
Listing 1-7 Characterize fetch_lfw_people
After executing code from Listing 1-7, your output should resemble the following:
target faces:
['Ariel Sharon' 'Colin Powell' 'Donald Rumsfeld' 'George W Bush'
'Gerhard Schroeder' 'Hugo Chavez' 'Tony Blair']
Listing 1-7 also displays Figure 1-5. Figure 1-5 is the reshaped image of the first data element
in the data set.
Figure 1-5 Reshaped image of the first data element in the data set
The code begins by importing requisite packages. The main block loads data into X, y, and
targets from NumPy files. The code continues by printing shapes of X and y. X contains 1288
1850-pixel vectors and y contains 1288 target values. Target labels are then displayed. The code
concludes by reshaping the first feature vector to a 50 × 37 image and displaying it with function
imshow.
Regression Data
We now change gears away from classification and move into regression. Regression is a machine
learning technique for predicting a numerical value based on the independent variables (or
feature set) of a data set. That is, we are measuring the impact of the feature set on a numerical
output. The first data set we characterize for regression is tips.
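Before turning to tips, here is a minimal sketch of regression in general (synthetic data, not an example from the book): a linear regression fits the relationship between one feature and a numerical target, then predicts a value for a new input.

import numpy as np
from sklearn.linear_model import LinearRegression

# synthetic data: target is roughly 3*x + 2 plus noise
rng = np.random.RandomState(0)
X = rng.rand(100, 1) * 10          # one independent variable (feature)
y = 3 * X.ravel() + 2 + rng.randn(100)

model = LinearRegression()
model.fit(X, y)

# predict a numerical value for a new input
print(model.coef_, model.intercept_)
print(model.predict([[5.0]]))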
Tips Data
The tips data set is integrated with the seaborn library. It consists of food server tips in
restaurants and related factors including tip, price of meal, and time of day. Specifically, features
include total_bill (price of meal), tip (gratuity), sex (male or female), smoker (yes or no), day
(Thursday, Friday, Saturday, or Sunday), time (lunch or dinner), and size of the party. Features are
coded as follows: total_bill (US dollars), tip (US dollars), sex (0=male, 1=female), smoker (0=no,
1=yes), day (3=Thur, 4=Fri, 5=Sat, 6=Sun). Tips data is represented by 244 elements with six
features predicting one target, the tip received from customers.
Listing 1-8 characterizes tips data.
import seaborn as sns

if __name__ == "__main__":
    br = '\n'
    sns.set(color_codes=True)
    tips = sns.load_dataset('tips')
    print (tips.head(), br)
    X = tips.drop(['tip'], axis=1).values
    y = tips['tip'].values
    print (X.shape, y.shape)
Listing 1-8 Characterize the tips data set
After executing code from Listing 1-8, your output should resemble the following:
(244, 6) (244,)
The code begins by loading tips as a Pandas DataFrame, displaying the first five records,
converting the data to NumPy, and displaying the feature set and target shapes. Seaborn data is
automatically loaded as a Pandas DataFrame. We couldn't display feature importance because
random forest algorithms expect numeric data, and several tips features are categorical. It takes a
great deal of data wrangling to get the data set into numeric form. We will transform categorical
data to numeric in later chapters.
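As a preview of that wrangling (a sketch, not the book's solution), one possibility is to one-hot encode the categorical columns with pandas get_dummies and train RandomForestRegressor, since the target (tip) is continuous:

import pandas as pd
import seaborn as sns
from sklearn.ensemble import RandomForestRegressor

tips = sns.load_dataset('tips')

# one-hot encode categorical columns so all features are numeric
X = pd.get_dummies(tips.drop(['tip'], axis=1))
y = tips['tip']

rfr = RandomForestRegressor(random_state=0, n_estimators=100)
rfr.fit(X, y)

# pair importance scores with column names, most important first
importance = sorted(zip(rfr.feature_importances_, X.columns),
                    reverse=True)
for row in importance[:3]:
    print(row)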
We now move from tips to the wine quality data sets. Listing 1-9 characterizes redwine.csv, which
consists of 1599 red wine samples with eleven features that predict the quality target.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

if __name__ == "__main__":
    br = '\n'
    f = 'data/redwine.csv'
    red_wine = pd.read_csv(f)
    X = red_wine.drop(['quality'], axis=1)
    y = red_wine['quality']
    print (X.shape)
    print (y.shape, br)
    features = list(X)
    rfr = RandomForestRegressor(random_state=0,
                                n_estimators=100)
    rfr_name = rfr.__class__.__name__
    rfr.fit(X, y)
    feature_importances = rfr.feature_importances_
    importance = sorted(zip(feature_importances, features),
                        reverse=True)
    n = 3
    print (n, 'most important features' + ' (' + rfr_name + '):')
    [print (row) for i, row in enumerate(importance) if i < n]
    for row in importance:
        print (row)
    print ()
    print (red_wine[['alcohol', 'sulphates', 'volatile acidity',
                     'total sulfur dioxide', 'quality']].head())
Listing 1-9 Characterize redwine
After executing code from Listing 1-9, your output should resemble the following:
(1599, 11)
(1599,)
The code example begins by loading pandas and RandomForestRegressor packages. The main
block loads redwine.csv into a Pandas DataFrame. It then displays feature and target shapes. The
code concludes by training pandas data with RandomForestRegressor, displaying the three most
important features, and displaying the first five records from the data set.
RandomForestRegressor is also an ensemble algorithm, but it is used when the target is numeric
or continuous.
Tip Always hard-code random_state (e.g., random_state=0) for algorithms that use this
parameter to stabilize results.
The white wine example follows the exact same logic, but output differs in terms of data set size
and feature importance.
Listing 1-10 characterizes whitewine.csv.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

if __name__ == "__main__":
    br = '\n'
    f = 'data/whitewine.csv'
    white_wine = pd.read_csv(f)
    X = white_wine.drop(['quality'], axis=1)
    y = white_wine['quality']
    print (X.shape)
    print (y.shape, br)
    features = list(X)
    rfr = RandomForestRegressor(random_state=0,
                                n_estimators=100)
    rfr_name = rfr.__class__.__name__
    rfr.fit(X, y)
    feature_importances = rfr.feature_importances_
    importance = sorted(zip(feature_importances, features),
                        reverse=True)
    n = 3
    print (n, 'most important features' + ' (' + rfr_name + '):')
    [print (row) for i, row in enumerate(importance) if i < n]
    print ()
    print (white_wine[['alcohol', 'sulphates',
                       'volatile acidity',
                       'total sulfur dioxide',
                       'quality']].head())
Listing 1-10 Characterize whitewine
After executing code from Listing 1-10, your output should resemble the following:
(4898, 11)
(4898,)