Rethinking Israel: Studies in the History and Archaeology of Ancient Israel in Honor of Israel Finkelstein, 2017
We deal with the general issue of handling statistical data in archaeology for the purpose of deducing sound, justified conclusions. Quantitative and statistical methods have been employed in archaeological practice since its emergence as a systematic discipline in the 19th century (Drower 1995). Since that early period, the focus of archaeological research has developed and shifted several times. The latest phase in this process, especially prominent in recent decades, is the proliferation of collaborations with various branches of the exact and natural sciences. Many new avenues of inquiry have been opened, and a wealth of information has become available to archaeologists. In our view, this plethora of newly obtained data requires a careful reexamination of existing statistical approaches and a restatement of the desired focus of some archaeological investigations. We are delighted to dedicate this article to Israel Finkelstein, our teacher, adviser, colleague, and friend, who is one of the father figures of this ongoing scientific revolution in archaeology (e.g., Finkelstein and Piasetzky 2010; Finkelstein et al. 2012, 2015), and wish him many more fruitful years of research.
Proceedings of the 4th International Workshop on Historical Document Imaging and Processing (HIP 2017), 2017
The problem of finding a prototype for typewritten or handwritten characters belongs to a family of "shape prior" estimation problems. In epigraphic research, such priors are derived manually and constitute the building blocks of "paleographic tables". Automatic solutions to this estimation problem are rare in both the Computer Vision and the OCR/Handwritten Text Recognition communities. We review some of the existing approaches and propose a new, robust scheme suited to the challenges of degraded historical documents. This fast and easy-to-implement method is applied to ancient Hebrew inscriptions dated to the First Temple period.
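The paper's own scheme is not reproduced here. As a minimal sketch of the general idea of robust prototype (shape prior) estimation, the snippet below takes the pixelwise median of pre-aligned, binarized instances of one character; the function name, the alignment assumption, and the majority-vote threshold are illustrative choices, not the article's method.

```python
import numpy as np

def robust_character_prototype(samples, threshold=0.5):
    """Estimate a binary prototype from aligned, binarized character images.

    samples: iterable of 2D arrays of identical shape, values in {0, 1},
             each an aligned instance of the same character.
    Returns a binary prototype from the pixelwise median, which is robust
    to isolated outlier pixels caused by stains or erosion.
    """
    stack = np.stack([np.asarray(s, dtype=float) for s in samples], axis=0)
    # Pixelwise median across the stack; with binary inputs this amounts
    # to a majority vote at every pixel.
    median_image = np.median(stack, axis=0)
    return (median_image >= threshold).astype(np.uint8)

# Toy usage: the stray corner pixel in the second sample is suppressed.
a = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]])
b = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 1]])
c = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]])
print(robust_character_prototype([a, b, c]))
```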
Alphabets, Texts and Artefacts in the Ancient Near East: Studies Presented to Benjamin Sass, 2016
Our research team enjoyed the privilege of collaborating with Benjamin Sass over a period of several years. We are happy to dedicate this article to him and wish to express our gratitude for what has been both a prodigious and enjoyable experience. The purpose of our joint endeavor has been the introduction of modern techniques from computer science and physics into the realm of Iron Age epigraphy. One of the most important issues addressed during our cooperation was facsimile creation, a necessary preliminary step in the process of deciphering and analyzing ancient inscriptions. Several manual facsimile construction techniques are currently in use: drawing based on collation of the artifact; outlining on transparent paper overlaid on a photograph of the inscription; and computer-aided depiction via software such as Adobe Photoshop, Adobe Illustrator, GIMP or Inkscape (see the Summary section below for software web links). Despite their importance for the field of epigraphy, little attention has thus far been devoted to the methodology of facsimile creation (though see the recent comprehensive treatment by Parker and Rollston 2016). Recent decades have seen the rapid development and consolidation of various computerized image processing algorithms. Among the most basic and popular tasks in this field is the creation of a black-and-white version of a given image, known as image binarization (see Fig. 1a–b). Such a binarized image often serves as a first step for further image processing tasks, such as Optical Character Recognition (OCR), text digitization and text analysis. Algorithmic binarization can therefore be seen as another method of facsimile creation. Furthermore, a relatively new sub-domain of image processing, Historical Document Imaging and Processing (HIP), specializes in handling antique documents of different types, periods and origins. Accordingly, binarization algorithms stemming from HIP are even more suitable for archaeological purposes.
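As a concrete illustration of binarization as automated facsimile creation, the sketch below applies Otsu's classical global threshold via scikit-image. It is a general-purpose baseline, not a HIP-specific algorithm of the kind discussed above, and the file names are placeholders; heavily degraded inscriptions generally call for the specialized methods mentioned in the article.

```python
# Minimal binarization sketch with Otsu's global threshold (scikit-image).
# Assumes an RGB or grayscale input image; paths are placeholders.
import numpy as np
from skimage import io, color, filters

image = io.imread("inscription_photo.png")
gray = color.rgb2gray(image) if image.ndim == 3 else image.astype(float)

threshold = filters.threshold_otsu(gray)   # single global threshold
facsimile = gray < threshold               # ink assumed darker than background

io.imsave("facsimile.png", (facsimile * 255).astype(np.uint8))
```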
Advances in Visual Computing, Lecture Notes in Computer Science 10072, 2016
Chan-Vese is an important and well-established segmentation method. However, it tends to be challenging to implement, with issues such as initialization problems and the need to set several free parameters. The paper presents a detailed analysis of the Chan-Vese framework. It establishes a relation between the Otsu binarization method and the fidelity terms of the Chan-Vese energy functional, allowing for an intelligent initialization of the scheme. An alternative fast and parameter-free morphological segmentation technique is also suggested. Our experiments indicate the soundness of the proposed algorithm.
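For reference, the standard two-phase Chan-Vese energy (written here in the usual textbook notation, not necessarily the paper's symbols) is:

```latex
E(c_1, c_2, C) \;=\; \mu\,\mathrm{Length}(C)
  \;+\; \lambda_1 \int_{\mathrm{inside}(C)} \bigl|I(x) - c_1\bigr|^2 \, dx
  \;+\; \lambda_2 \int_{\mathrm{outside}(C)} \bigl|I(x) - c_2\bigr|^2 \, dx
```

If the length term is dropped, λ₁ = λ₂, and the partition is restricted to one induced by a gray-level threshold, minimizing the two fidelity terms amounts to minimizing the within-class variance, which is exactly Otsu's criterion; connections of this kind are what motivate an Otsu-based initialization of the scheme.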
Proceedings of the 15th International Conference on Frontiers in Handwriting Recognition, 2016
This article discusses the quality assessment of binary images. The customary ground-truth-based methodology used in the literature is shown to be problematic due to its subjective nature. Several previously suggested alternatives are surveyed and are also found to be inadequate in certain scenarios. A new approach, quantifying the adherence of a binarization to its document image, is proposed and tested using six different measures of accuracy. The measures are evaluated experimentally on datasets from the DIBCO and H-DIBCO competitions, with respect to different kinds of binarization degradations.
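The article's six measures are not reproduced here. As a toy illustration of the underlying idea of scoring a binarization against its own gray-level image rather than against a ground truth, the hypothetical measure below reports how much of the image's intensity variance is explained by the foreground/background split; it is our illustrative stand-in, not one of the paper's measures.

```python
import numpy as np

def adherence_score(gray, binary):
    """Toy no-ground-truth adherence measure.

    gray:   2D float array (the document image).
    binary: 2D boolean array, True where the binarization marks ink.
    Returns a value in [0, 1]; higher means the split explains more of
    the image's intensity variance (a normalized between-class variance).
    """
    fg, bg = gray[binary], gray[~binary]
    if fg.size == 0 or bg.size == 0:
        return 0.0
    w_fg, w_bg = fg.size / gray.size, bg.size / gray.size
    between_class = w_fg * w_bg * (fg.mean() - bg.mean()) ** 2
    total_var = gray.var()
    return float(between_class / total_var) if total_var > 0 else 0.0
```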
2012 10th IAPR International Workshop on Document Analysis Systems, 2012
The discipline of First Temple period epigraphy (the study of ancient writing) relies heavily on manually drawn facsimiles (black-and-white images) of inscriptions. This practice may unintentionally mix documentation with interpretation. The article proposes a new method for evaluating the quality of a facsimile, based on a measure comparing the image of the inscription to the registered facsimile. Some empirical results supporting the methodology are presented. The technique is also relevant to the quality evaluation of other types of facsimiles, and to binarization in general.
Proceedings of the National Academy of Sciences, 2016
The relationship between the expansion of literacy in Judah and the composition of biblical texts has attracted scholarly attention for over a century. Information on this issue can be deduced from Hebrew inscriptions from the final phase of the First Temple period. We report our investigation of 16 inscriptions from the Judahite desert fortress of Arad, dated ca. 600 BCE, the eve of Nebuchadnezzar's destruction of Jerusalem. The inquiry is based on new methods for image processing and document analysis, as well as machine learning algorithms. These techniques enable identification of the minimal number of authors in a given group of inscriptions. Our algorithmic analysis, complemented by the textual information, reveals a minimum of six authors within the examined inscriptions. The results indicate that in this remote fort literacy had spread throughout the military hierarchy, down to the quartermaster and probably even below that rank. This implies that an educational infrastructure that could support the composition of literary texts in Judah already existed before the destruction of the First Temple. A similar level of literacy in this area is attested again only some 400 years later, ca. 200 BCE.
Proceedings of the 16th International Graphonomics Society Conference, 2013
The binarization of a challenging historical inscription is improved by means of sparse methods. The approximation is based on a binary dictionary learned from a clear source using k-medians and k-medoids algorithms. Preliminary results show superiority over the existing binarization with respect to fine features such as stroke continuity, deviations from a straight line, edge noise and the presence of stains. The k-medians dictionary-learning scheme remains robust when the initial patch database is reduced.
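A bare-bones sketch of the dictionary-learning step is given below: a simple k-medians loop over flattened binary patches from a clear source. It illustrates the idea only; the cluster count, distance choice, and function name are our assumptions, not the article's exact scheme.

```python
import numpy as np

def kmedians_patch_dictionary(patches, k, iters=20, seed=0):
    """Learn a small binary patch dictionary with a plain k-medians loop.

    patches: (n, d) array of flattened binary patches (values 0/1)
             sampled from a clear, well-preserved source.
    k:       number of dictionary atoms.
    Returns a (k, d) binary dictionary.
    """
    patches = np.asarray(patches, dtype=float)
    rng = np.random.default_rng(seed)
    atoms = patches[rng.choice(len(patches), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assign each patch to its nearest atom (L1 distance suits medians).
        dists = np.abs(patches[:, None, :] - atoms[None, :, :]).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Update each atom as the coordinate-wise median of its cluster.
        for j in range(k):
            members = patches[labels == j]
            if len(members):
                atoms[j] = np.median(members, axis=0)
    return (atoms >= 0.5).astype(np.uint8)   # keep the atoms binary

# A degraded patch can then be approximated by its nearest dictionary atom.
```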
Preconditioners for hyperbolic systems are numerical devices for accelerating convergence to a steady state. In addition, the preconditioner should be included in the artificial viscosity or upwinding terms to improve the accuracy of the steady-state solution. For time-dependent problems we use a dual time stepping approach. The preconditioner affects the convergence rate and the accuracy of the subiterations within each physical time step. We consider two types of local preconditioners: Jacobi and low-speed preconditioning. The algorithm can be expressed in several sets of variables while using only the conservation variables for the flux terms. We compare the effect of these various variable sets on the efficiency and accuracy of the scheme.
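For orientation, a common way to write preconditioned dual time stepping (generic notation, not necessarily the paper's symbols: Γ is the local preconditioner, R the spatial residual, τ the pseudo-time, and a second-order backward difference is used in physical time) is:

```latex
\Gamma^{-1}\,\frac{\partial Q}{\partial \tau}
  \;+\; \frac{3\,Q - 4\,Q^{\,n} + Q^{\,n-1}}{2\,\Delta t}
  \;+\; R(Q) \;=\; 0
```

When the pseudo-time subiterations converge, the τ-derivative vanishes and Γ drops out of the physical-time solution, so the preconditioner influences only the convergence rate and the accuracy actually reached within each physical time step, as described above.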
We consider the steady-state equations for a compressible fluid. For low-speed flow the system is stiff, since the ratio of the convective speed to the speed of sound is very small. To overcome this difficulty we alter the time dependency of the equations while retaining the same steady-state operator. In order to achieve high numerical resolution we also alter the artificial dissipation (or Roe matrix) of the numerical scheme. The definition of preconditioners and artificial dissipation terms can be formulated conveniently by using sets of dependent variables other than the conservation variables. The effects of different preconditioners, artificial dissipation and grid density on the accuracy and convergence to steady state of the numerical solutions are presented in detail. The numerical results obtained for inviscid and viscous two- and three-dimensional flows over external aerodynamic bodies indicate that efficient multigrid computations of flows with very low Mach numbers are now possible. © 1997 Elsevier Science Ltd.
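One common way of writing the two modifications described here, in generic notation rather than the paper's exact symbols (P is the preconditioning matrix, A the flux Jacobian, and the second line a preconditioned Roe-type interface flux in one dimension), is:

```latex
% Preconditioned time derivative; the steady-state operator is unchanged
P^{-1}\,\frac{\partial Q}{\partial t}
  \;+\; \frac{\partial F}{\partial x} \;+\; \frac{\partial G}{\partial y} \;=\; 0,
\qquad
F_{i+1/2} \;=\; \tfrac{1}{2}\bigl(F_L + F_R\bigr)
  \;-\; \tfrac{1}{2}\,P^{-1}\bigl|P A\bigr|\bigl(Q_R - Q_L\bigr)
```

At steady state the time-derivative term vanishes, so the converged solution satisfies the original equations; only the path to steady state and the dissipation matrix (P⁻¹|PA| in place of |A|) are altered, which is what restores accuracy at very low Mach numbers.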
Three explicit multigrid methods, Ni's method, Jameson's finite-volume method, and a finite-difference method based on Brandt's work, are described and compared for two model problems. All three methods use an explicit multistage Runge-Kutta scheme on the fine grid, and this scheme is also described. Convergence histories for inviscid flow over a bump in a channel, using the fine-grid scheme alone, show that the convergence rate is proportional to the Courant number and that implicit residual smoothing can significantly accelerate the scheme. Ni's method was slightly slower than the implicitly smoothed scheme alone. Brandt's and Jameson's methods are shown to be equivalent in form but differ in their node-centered versus cell-centered implementations. They are about 8.5 times faster than Ni's method in terms of CPU time. Results for an oblique shock/boundary-layer interaction problem verify the accuracy of the finite-difference code. All methods slowed considerably on the stretched viscous grid, but Brandt's method was still 2.1 times faster than Ni's method.
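For reference, the explicit multistage Runge-Kutta smoother shared by the three methods has the generic Jameson-style form below; the stage coefficients α_k and stage count m differ between implementations, so this is the textbook form rather than any one code's coefficients.

```latex
w^{(0)} = w^{\,n}, \qquad
w^{(k)} = w^{(0)} - \alpha_k\,\Delta t\,R\bigl(w^{(k-1)}\bigr),
\quad k = 1,\dots,m, \qquad
w^{\,n+1} = w^{(m)}
```

Implicit residual smoothing, mentioned in the comparison, replaces the residual R(w^{(k-1)}) at each stage by a smoothed residual, which permits larger Courant numbers and hence the observed acceleration.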
The authors present a new method of writer identification that employs the full power of multiple experiments and yields a statistically significant result. Each binarized and segmented character is represented as a histogram of 512 binary pixel patterns (3 × 3 black-and-white patches). When comparing two given inscriptions under a "single author" assumption, the algorithm performs a Kolmogorov–Smirnov test for each letter and each patch. The resulting p-values are combined using Fisher's method, producing a single p-value. Experiments on both Modern and Ancient Hebrew datasets demonstrate the excellent performance and robustness of this approach.
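A minimal sketch of the per-letter, per-patch comparison idea follows, assuming a hypothetical data layout (a dict mapping each letter to an array of per-character patch histograms); the exact feature extraction and test setup of the paper are not reproduced here.

```python
# Compare two inscriptions under a "single author" assumption by running a
# two-sample KS test per letter and per 3x3 patch type, then pooling the
# p-values with Fisher's method. Data layout and names are illustrative.
import numpy as np
from scipy.stats import ks_2samp, combine_pvalues

def compare_inscriptions(hist_a, hist_b):
    """hist_a, hist_b: dicts mapping a letter to an (n_chars, 512) array of
    per-character patch histograms for inscriptions A and B."""
    p_values = []
    for letter in set(hist_a) & set(hist_b):
        a, b = np.asarray(hist_a[letter]), np.asarray(hist_b[letter])
        for patch in range(a.shape[1]):
            # Skip patch types that are constant in both inscriptions.
            if a[:, patch].std() or b[:, patch].std():
                p_values.append(ks_2samp(a[:, patch], b[:, patch]).pvalue)
    if not p_values:
        return 1.0
    # Fisher's method yields one p-value for the single-author hypothesis;
    # a small value suggests more than one writer.
    _, combined_p = combine_pvalues(p_values, method="fisher")
    return float(combined_p)
```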