This document summarizes the Geographic Support System Initiative (GSS-I) at the U.S. Census Bureau, which aims to evaluate areas for targeted address canvassing for the 2020 Census. It describes how the Census Bureau is using partner data, predictive modeling, and analysis of the MAF/TIGER database to determine targeting areas. It also outlines the FME-based software used to standardize, check for completeness and topology, and identify qualifying intersections in partner feature datasets.
Discover how you can process data in real-time by taking advantage of FME Server 2012’s event-driven architecture. You’ll learn how to accept data from sensors, feeds, and devices, as well as from people via email, Twitter, SMS, and more. You’ll then find out how FME’s transformation capabilities allow you to easily take action in whatever way you want, e.g. generate an alert, add information to a map, update a database, etc. Overall, you’ll see how you can instantly get data to everyone who needs it.
Leveraging Autodesk Products with FME: AutoCAD to GIS is Only the Beginning (Safe Software)
Go beyond AutoCAD to GIS and achieve more with FME. In this webinar you’ll learn how FME can be used to leverage several Autodesk solutions including AutoCAD Civil 3D, Revit and A360 with other applications in your organization.
How to Exchange Data between CAD and GIS (Safe Software)
Gain total control over CAD and GIS data exchange. Discover how to use FME to preserve the information in CAD annotation when converting to GIS, and turn GIS data and attributes into rich, clean CAD drawings. You'll see how you can use reusable workflows to easily transform virtually any CAD or GIS data including AutoCAD, Esri ArcGIS, MapInfo, and MicroStation.
See FME Desktop 2017 in action. Learn how you can take advantage of the top new features, formats, and transformers to solve more data challenges even faster.
Task modeling: Understanding what people want and how to design for them (cxpartners)
This document discusses task modeling and how it can be used to design interactions that fit with how people naturally work. It describes task modeling as understanding the steps users take and decisions they need to make to accomplish a goal. The document also discusses different types of research that can be done, such as observing users in their natural environments or listening to call center interactions, to understand user behaviors and needs. Finally, it provides examples of three common types of behaviors - direct connections, controlled evaluations, and complex evaluations - that should be supported in design based on task modeling research.
BIM Workflows: How to Build from CAD & GIS for Infrastructure (Safe Software)
BIM workflows give facilities managers, architects, and engineers key information for better-informed infrastructure planning and management. But how do you migrate to a BIM system when your current data is stored in CAD? Through a real-world international airport example, find out how CAD and engineering data can be centralized in a Document Management System (Autodesk Vault) and GIS database (SQL Server Spatial) using FME, and learn how to create BIM workflows from CAD data.
Using FME to Convert TIGER Spatial Data From Oracle Spatial To ESRI Shapefiles (Safe Software)
In an effort to increase the availability of Topologically Integrated Geographic Encoding and Reference (TIGER) spatial data, the Geography Division of the US Census Bureau has made increasing usage of the ESRI Shapefile format for the exchange of spatial data, both internally and for public products. To accomplish this goal, the basic conversion functionality provided by FME has been utilized in the creation of a number of shapefile products. We will discuss our basic implementation approach, which utilizes manually prepared FME mapping files that feed into a fully automated shapefile creation process and is used for the creation of most of our shapefile products. This process is used to create the Public TIGER/Line Shapefiles, as well as the shapefiles provided to local government partners for updates using the MAF/TIGER Partnership Software. We will also discuss our more recent experience using FME Workbench to design more complicated geo-processing models, which are then saved as FME mapping files and integrated into a similar, automated process. This new process is used to create the shapefiles that serve as the data source for the American Fact Finder web site.
ODTUG KSCOPE 2017 - Black Belt Techniques for FDMEE and Cloud Data Management (Francisco Amores)
This document provides techniques for advanced data integration using Oracle's Hyperion Financial Management (HFM) and Financial Data Management Enterprise Edition (FDMEE). It discusses 25 techniques across areas like data extraction, mappings, scripting, integration with EPM applications, and automation. Examples include using member lists and functions to extract additional data, mapping based on target dimension values, running MaxL scripts via the Essbase JAPI, and enhancing the standard scheduler to allow more flexible scheduling options.
Geospatial Synergy: Amplifying Efficiency with FME & Esri (Safe Software)
Dive deep into the world of geospatial data management and transformation in our upcoming webinar focusing on the powerful integration of FME and Esri technologies. This insightful session comprises two compelling segments aimed at enhancing your geospatial workflows, while minimizing operational hurdles.
In the first segment, guest speaker Jan Roggisch from Locus unveils how Auckland Council triumphed over the challenges of handling large, frequent data updates on ArcGIS Online using FME. Discover the journey from manual data handling to an automated, streamlined process that reduced server downtime from minutes to seconds, setting a new standard for local government organizations.
The second segment, led by James Botterill from 1Spatial, unveils the magic of incorporating ArcPy into your FME workflows. Delve into real-world scenarios where ArcGIS geoprocessing is harmoniously orchestrated within FME using the PythonCaller. Gain insights into raster-vector data conversion, spatial analysis, and a host of practical tips and tricks that empower you to leverage the combined capabilities of FME and Esri for efficient data manipulation and conversion.
Join us to explore the remarkable possibilities that open up when FME and Esri technologies converge – enhancing your ability to manage and transform geospatial data with unprecedented efficiency.
Webinar here: https://youtu.be/MEmFFwNmLxg
Sumo Logic "How To" Webinar - Monitoring your Data: Alerting on Outliers
Dashboards are fantastic, but how do I get notified of critical events? This webinar will cover how to create alerts that will allow your team to effectively monitor business-critical events. Alert channels include email or webhooks into Slack, PagerDuty, DataDog, ServiceNow, or any other webhook you want to develop. What about running custom scripts triggered from alerts? Let's do it.
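The alert channels mentioned above all work the same way under the hood: an HTTP POST of a JSON payload to a webhook URL. As a rough illustration (the payload field names and endpoint here are hypothetical, not an official Sumo Logic or Slack schema), an alert delivery can be sketched with only the standard library:

```python
import json
from urllib import request

def build_alert_payload(query_name, threshold, value):
    """Build a simple alert message body; the 'text' field name is
    illustrative, not an official schema."""
    return {
        "text": f"ALERT: '{query_name}' value {value} exceeded threshold {threshold}"
    }

def send_webhook(url, payload):
    """POST the payload as JSON to a webhook endpoint (Slack, PagerDuty,
    ServiceNow, etc. all accept variations of this shape)."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:  # fires the alert
        return resp.status

# Example payload for a hypothetical endpoint:
payload = build_alert_payload("5xx error rate", threshold=100, value=342)
```

In practice the receiving service (Slack, PagerDuty, DataDog) defines the exact payload schema; the transport is identical.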
An SDN-Based Application-Aware Network Provisioning (Stanley Wang)
The document discusses application aware SDN network provisioning. It begins with an overview of YARN architecture in Hadoop, including its benefits over earlier Hadoop architectures like improved scalability and utilization. It then discusses how SDN can be integrated with big data and cloud computing workloads by optimizing network topology and routing based on traffic patterns. Two approaches are proposed - reactive, where the SDN controller learns patterns from job logs/endpoints and modifies paths, and proactive where applications directly inform the network of intent. Finally, it proposes a service profile based SDN platform that uses network profiles and APIs to declaratively define logical topologies and provide network services and abstractions to applications.
Hadoop is an open-source framework for distributed storage and processing of large datasets across clusters of commodity hardware. It implements Google's MapReduce programming model and the Hadoop Distributed File System (HDFS) for reliable data storage. Key components include a JobTracker that coordinates jobs, TaskTrackers that run tasks on worker nodes, and a NameNode that manages the HDFS namespace and DataNodes that store application data. The framework provides fault tolerance, parallelization, and scalability.
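The map/shuffle/reduce contract described above can be shown without a cluster. The following pure-Python sketch (an illustration of the programming model, not Hadoop's actual runtime) counts words the way a MapReduce job would: mappers emit key/value pairs, the framework sorts and groups them by key, and reducers aggregate each group:

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # Map phase: emit (word, 1) for every word in an input line.
    for word in line.split():
        yield (word.lower(), 1)

def reducer(word, counts):
    # Reduce phase: sum all counts for a single key.
    return (word, sum(counts))

def map_reduce(lines):
    # Shuffle/sort: group intermediate pairs by key, as the framework
    # does between the map and reduce phases.
    intermediate = sorted(kv for line in lines for kv in mapper(line))
    return [
        reducer(word, (c for _, c in group))
        for word, group in groupby(intermediate, key=itemgetter(0))
    ]

result = dict(map_reduce(["the quick fox", "the lazy dog", "the fox"]))
```

On a real cluster the map and reduce calls run in parallel on TaskTracker nodes and the shuffle moves data over the network, but the contract is exactly this.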
QuickStart your Sumo Logic service with this exclusive webinar. At these monthly live events you will learn how to capitalize on critical capabilities that can amplify your log analytics and monitoring experience while providing you with meaningful business and IT insights.
More info: sumologic.com/training
Live Webinar is found here: https://youtu.be/Q1yWlInxWVs
Apache Tez is a framework for building data processing applications on top of YARN. It allows expressing a computation as a directed acyclic graph (DAG) to optimize execution. Tez improves on MapReduce by avoiding intermediate data writes to HDFS and enabling optimizations across jobs. The presentation covered Tez features like container reuse, dynamic parallelism, and integration with YARN timeline service. It also discussed ongoing work to improve performance through speculation, intermediate file formats, and shuffle optimizations.
Apache Tez is a framework for executing data processing jobs on Hadoop clusters. It allows expressing jobs as directed acyclic graphs (DAGs) which enables optimizations like running jobs as a single logical unit rather than separate MapReduce jobs. The presentation covered Tez features like container reuse, dynamic parallelism, and integration with YARN and ATS for monitoring. It also discussed ongoing work to improve performance through speculation, intermediate file formats, and shuffle optimizations, as well as better debuggability using tools like the Tez UI.
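The key idea in both summaries above, expressing a computation as a DAG so intermediate results flow between vertices instead of being written to HDFS between separate MapReduce jobs, can be sketched in a few lines of Python. This is a toy scheduler illustrating the concept, not Tez's actual API:

```python
def run_dag(vertices, edges):
    """Execute a task DAG in dependency order, passing each vertex the
    in-memory outputs of its upstream vertices (instead of persisting
    intermediate results between separate jobs)."""
    indegree = {v: 0 for v in vertices}
    downstream = {v: [] for v in vertices}
    for src, dst in edges:
        indegree[dst] += 1
        downstream[src].append(dst)
    ready = [v for v, d in indegree.items() if d == 0]
    upstream_out = {v: [] for v in vertices}
    results = {}
    while ready:
        v = ready.pop()
        results[v] = vertices[v](upstream_out[v])  # run the vertex task
        for nxt in downstream[v]:
            upstream_out[nxt].append(results[v])
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    return results

# A three-vertex DAG: two map-style vertices feeding one join/reduce vertex.
dag = {
    "map_a": lambda _: [1, 2, 3],
    "map_b": lambda _: [10, 20],
    "join":  lambda ins: sum(x for part in ins for x in part),
}
out = run_dag(dag, [("map_a", "join"), ("map_b", "join")])
```

Features like container reuse and dynamic parallelism are optimizations layered on top of exactly this kind of dependency-ordered execution.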
The document provides an overview and best practices for tuning an Alfresco installation for performance. It discusses disabling unused services, limiting folder hierarchies and group nesting, monitoring resources, tuning Solr indexes and caches, and using separate servers for specific tasks like indexing. General tips include testing changes thoroughly before deploying, adjusting sizing for increased usage, and following the standard performance methodology.
The document provides an overview and best practices for tuning an Alfresco installation. It discusses disabling unused services, limiting group hierarchies, monitoring resources, optimizing Solr configuration, indexing processes, and query caching. General tips include separating custom configurations, testing backups and changes, and using support tools for troubleshooting performance issues.
This document discusses run-time addressing and storage of variables in programming. It covers how variables are accessed using offsets from frames or stacks. It also discusses variable-length local data and how it can be allocated dynamically on the stack or heap. The document then covers scope, static and dynamic scoping rules, and how static links are used to access non-local variables at run-time.
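The static-link mechanism mentioned above is easy to simulate: each activation record carries a pointer to the frame of its lexically enclosing scope, and a non-local access follows a fixed number of links determined at compile time. A minimal sketch (an illustration of the idea, not any particular compiler's layout):

```python
class Frame:
    """A simplified activation record: local variables plus a static link
    to the frame of the lexically enclosing scope."""
    def __init__(self, locals_, static_link=None):
        self.locals = locals_
        self.static_link = static_link

def lookup(frame, name, depth):
    # Follow `depth` static links to reach the frame where `name` lives,
    # then read it (a real compiler would use a fixed offset, not a name).
    for _ in range(depth):
        frame = frame.static_link
    return frame.locals[name]

# outer() declares x; inner(), nested one lexical level deeper, reads it.
outer = Frame({"x": 42})
inner = Frame({"y": 1}, static_link=outer)
value = lookup(inner, "x", depth=1)  # non-local access: one hop up
```

Under dynamic scoping the search would instead walk the caller chain at run time, which is why static scoping permits the depth to be computed once, at compile time.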
Accelerated development in Automotive E/E Systems using VisualSim Architect (Deepak Shankar)
Recent trends in the automotive sector towards fully autonomous driving systems and vehicle-to-vehicle (V2V) communication mean a drastic increase in the number of sensors and ECUs, along with heightened concerns for safety and security. This calls for thorough evaluation of the target system architecture at all levels: hardware, software, and network. During this webinar, we show how we evaluate each of these aspects of the automotive E/E system and take a closer look at the performance, power, and functional correctness of each of the auto subsystems. We will also inject faults into the demo model to show how the automotive system would perform under failure.
The webinar also showcases various use case examples, including a comparison of TSN standards, modelling of various topologies, task graph modelling, glimpses into the TC10 sleep-wakeup standard, and integrated software.
Where Should You Deliver Database Services From? (EDB)
Organizations have a choice when it comes to database platform. On the one hand, the dev platforms that cloud vendors provide are fast and loaded with tools, you don’t manage the infrastructure and only pay for what you use. On the other hand, deploying on premises can mean better control over performance, security, user experience and compliance. This session discusses making those choices – not for one app, but at enterprise scale. What should you consider? Which factors say “cloud”? Which factors say “on premises”? And can you get a consumption-based experience either way?
This document provides an overview of Spark architecture and its key concepts. It begins with discussing distributed systems challenges prior to Spark and how Google File System addressed these. It then explains Spark's architecture which includes a driver program that coordinates executors running on worker nodes to process RDDs represented as a DAG. The document also compares Spark concepts like RDDs and partitions to GFS concepts like files and chunks to highlight their similarities.
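The RDD-and-partitions idea summarized above, data split across workers, transformations recorded lazily as a DAG, execution deferred until an action, can be mimicked in plain Python. This is a toy stand-in to show the concept; it is not Spark's real API:

```python
class ToyRDD:
    """A toy stand-in for a Spark RDD: data split into partitions, with
    lazy per-partition transformations that only run when an action
    (collect) is called."""
    def __init__(self, partitions, ops=None):
        self.partitions = partitions
        self.ops = ops or []

    def map(self, fn):
        # Narrow transformation: recorded, not executed (lazy DAG building).
        return ToyRDD(self.partitions, self.ops + [fn])

    def collect(self):
        # Action: each 'executor' applies the op chain to its partition,
        # and the 'driver' gathers the results.
        out = []
        for part in self.partitions:
            for item in part:
                for fn in self.ops:
                    item = fn(item)
                out.append(item)
        return out

rdd = ToyRDD([[1, 2], [3, 4]])          # two partitions, as if on two workers
squares = rdd.map(lambda x: x * x).collect()
```

The GFS analogy in the document maps directly onto this: partitions play the role of chunks, and the driver plays the role of the master coordinating where work runs.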
Similar to FME Data Transformation for the Geographic Support System Initiative
Mastering MicroStation DGN: How to Integrate CAD and GIS (Safe Software)
Dive deep into the world of CAD-GIS integration and elevate your workflows to next-level efficiency. Discover how to seamlessly transfer data between Bentley MicroStation and leading GIS platforms, such as Esri ArcGIS.
This session goes beyond mere CAD/GIS conversion, showcasing techniques to precisely transform MicroStation elements including cells, text, lines, and symbology. We’ll walk you through tags versus item types, and understanding how to leverage both. You’ll also learn how to reproject to any coordinate system. Finally, explore cutting-edge automated methods for managing database links, and delve into innovative strategies for enabling self-serve data collection and validation services.
Join us to overcome the common hurdles in CAD and GIS integration and enhance the efficiency of your workflows. This session is perfect for professionals, both new to FME and seasoned users, seeking to streamline their processes and leverage the full potential of their CAD and GIS systems.
Data Integration Basics: Merging & Joining Data (Safe Software)
Are you tired of dealing with data trapped in silos? Join our upcoming webinar to learn how to efficiently merge and join disparate datasets, transforming your data integration capabilities. This webinar is designed to empower you with the knowledge and skills needed to efficiently integrate data from various sources, allowing you to draw more value from your data.
With FME, merging and joining different types of data—whether it’s spreadsheets, databases, or spatial data—becomes a straightforward process. Our expert presenters will guide you through the essential techniques and best practices.
In this webinar, you will learn:
- Which transformers work best for your specific data types.
- How to merge attributes from multiple datasets into a single output.
- Techniques to automate these processes for greater efficiency.
Don’t miss out on this opportunity to enhance your data integration skills. By the end of this webinar, you’ll have the confidence to break down data silos and integrate your data seamlessly, boosting your productivity and the value of your data.
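The attribute-merging step described above, combining records from two sources on a shared key, is conceptually a join. As a plain-Python sketch of what a join transformer does (an illustration of the concept, not FME's implementation or API):

```python
def join_attributes(left, right, key):
    """Left-join two record sets on `key`, merging attributes from the
    matching right-side record into each left-side record."""
    index = {rec[key]: rec for rec in right}   # build a lookup on the join key
    merged = []
    for rec in left:
        match = index.get(rec[key], {})
        merged.append({**match, **rec})        # left values win on clashes
    return merged

# Hypothetical sample data: parcels joined to an owners table by id.
parcels = [{"id": 1, "area": 500}, {"id": 2, "area": 320}]
owners  = [{"id": 1, "owner": "Smith"}, {"id": 3, "owner": "Lee"}]
joined = join_attributes(parcels, owners, key="id")
```

Unmatched left records pass through without the joined attributes, which is the usual left-join behavior; an inner join would drop them instead.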
Coordinate Systems in FME 101 - Webinar Slides (Safe Software)
If you’ve ever had to analyze a map or GPS data, chances are you’ve encountered and even worked with coordinate systems. As historical data continually updates through GPS, understanding coordinate systems is increasingly crucial. However, not everyone knows why they exist or how to effectively use them for data-driven insights.
During this webinar, you’ll learn exactly what coordinate systems are and how you can use FME to maintain and transform your data’s coordinate systems in an easy-to-digest way, accurately representing the geographical space that it exists within. During this webinar, you will have the chance to:
- Enhance Your Understanding: Gain a clear overview of what coordinate systems are and their value
- Learn Practical Applications: Why we need datums and projections, plus how units vary between coordinate systems
- Maximize with FME: Understand how FME handles coordinate systems, including a brief summary of the 3 main reprojectors
- Custom Coordinate Systems: Learn how to work with FME and coordinate systems beyond what is natively supported
- Look Ahead: Gain insights into where FME is headed with coordinate systems in the future
Don’t miss the opportunity to improve the value you receive from your coordinate system data, ultimately allowing you to streamline your data analysis and maximize your time. See you there!
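Reprojection, the core operation the webinar above covers, is ultimately just a coordinate transformation formula. As a concrete taste, here is the standard spherical Web Mercator (EPSG:3857) forward projection from WGS84 longitude/latitude, hand-rolled for illustration (a reprojector transformer handles datum shifts and many more systems than this single formula):

```python
import math

EARTH_RADIUS = 6378137.0  # metres, WGS84 semi-major axis

def to_web_mercator(lon_deg, lat_deg):
    """Project WGS84 lon/lat (degrees) to Web Mercator x/y (metres)."""
    x = EARTH_RADIUS * math.radians(lon_deg)
    y = EARTH_RADIUS * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

x, y = to_web_mercator(-123.1, 49.28)  # Vancouver, roughly
```

Note the formula diverges as latitude approaches the poles, one of many edge cases (along with datum transformations and custom systems) that dedicated reprojection tooling exists to handle.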
An Introduction to All Data Enterprise Integration (Safe Software)
Are you spending more time wrestling with your data than actually using it? You’re not alone. For many organizations, managing data from various sources can feel like an uphill battle. But what if you could turn that around and make your data work for you effortlessly? That’s where FME comes in.
We’ve designed FME to tackle these exact issues, transforming your data chaos into a streamlined, efficient process. Join us for an introduction to All Data Enterprise Integration and discover how FME can be your game-changer.
During this webinar, you’ll learn:
- Why Data Integration Matters: How FME can streamline your data process.
- The Role of Spatial Data: Why spatial data is crucial for your organization.
- Connecting & Viewing Data: See how FME connects to your data sources, with a flash demo to showcase.
- Transforming Your Data: Find out how FME can transform your data to fit your needs. We’ll bring this process to life with a demo leveraging both geometry and attribute validation.
- Automating Your Workflows: Learn how FME can save you time and money with automation.
Don’t miss this chance to learn how FME can bring your data integration strategy to life, making your workflows more efficient and saving you valuable time and resources. Join us and take the first step toward a more integrated, efficient, data-driven future!
Essentials of Automations: Exploring Attributes & Automation Parameters (Safe Software)
Building automations in FME Flow can save time, money, and help businesses scale by eliminating data silos and providing data to stakeholders in real-time. One essential component to orchestrating complex automations is the use of attributes & automation parameters (both formerly known as “keys”). In fact, it’s unlikely you’ll ever build an Automation without using these components, but what exactly are they?
Attributes & automation parameters enable the automation author to pass data values from one automation component to the next. During this webinar, our FME Flow Specialists will cover leveraging the three types of these output attributes & parameters in FME Flow: Event, Custom, and Automation. As a bonus, they’ll also be making use of the Split-Merge Block functionality.
You’ll leave this webinar with a better understanding of how to maximize the potential of automations by making use of attributes & automation parameters, with the ultimate goal of setting your enterprise integration workflows up on autopilot.
Driving Business Innovation: Latest Generative AI Advancements & Success Story (Safe Software)
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Essentials of Automations: The Art of Triggers and Actions in FME (Safe Software)
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Essentials of Automations: Optimizing FME Workflows with Parameters (Safe Software)
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
The Zero-ETL Approach: Enhancing Data Agility and Insight (Safe Software)
In the ever-evolving landscape of data management, Zero-ETL is an approach that is reshaping how businesses handle and integrate their data. This webinar explores Zero-ETL, a paradigm shift from the traditional Extract, Transform, Load (ETL) process, offering a more streamlined, efficient, and real-time data integration method.
We will begin with an introduction to the concept of Zero-ETL, including how it allows direct access to data in its native environment and real-time data transformation, providing up-to-date information with significantly reduced data redundancy.
Next, we'll take you through several demonstrations showing how Zero-ETL can deliver real-time data and enable the free movement of data between systems. We will also discuss the various tools that support all aspects of Zero-ETL, providing attendees with an understanding of how they can adopt this innovative approach in their organizations.
Lastly, the session will conclude with an interactive Q&A segment, allowing participants to gain deeper insights into how Zero-ETL can be tailored to their specific business needs and how they can get started today.
Join us to discover how Zero-ETL can elevate your organization's data strategy.
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME (Safe Software)
Following the popularity of “Cloud Revolution: Exploring the New Wave of Serverless Spatial Data,” we’re thrilled to announce this much-anticipated encore webinar.
In this sequel, we’ll dive deeper into the Cloud-Native realm by uncovering practical applications and FME support for these new formats, including COGs, COPC, FlatGeoBuf, GeoParquet, STAC, and ZARR.
Building on the foundation laid by industry leaders Michelle Roby of Radiant Earth and Chris Holmes of Planet in the first webinar, this second part offers an in-depth look at the real-world application and behind-the-scenes dynamics of these cutting-edge formats. We will spotlight specific use-cases and workflows, showcasing their efficiency and relevance in practical scenarios.
Discover the vast possibilities each format holds, highlighted through detailed discussions and demonstrations. Our expert speakers will dissect the key aspects and provide critical takeaways for effective use, ensuring attendees leave with a thorough understanding of how to apply these formats in their own projects.
Elevate your understanding of how FME supports these cutting-edge technologies, enhancing your ability to manage, share, and analyze spatial data. Whether you’re building on knowledge from our initial session or are new to the serverless spatial data landscape, this webinar is your gateway to mastering cloud-native formats in your workflows.
From Event to Action: Accelerate Your Decision Making with Real-Time Automation – Safe Software
Imagine a world where information flows as swiftly as thought itself, making decision-making as fluid as the data driving it. Every moment is critical, and the right tools can significantly boost your organization’s performance. The power of real-time data automation through FME can turn this vision into reality.
Aimed at professionals eager to leverage real-time data for enhanced decision-making and efficiency, this webinar will cover the essentials of real-time data and its significance. We’ll explore:
FME’s role in real-time event processing, from data intake and analysis to transformation and reporting
An overview of leveraging streams vs. automations
FME’s impact across various industries highlighted by real-life case studies
Live demonstrations on setting up FME workflows for real-time data
Practical advice on getting started, best practices, and tips for effective implementation
Join us to enhance your skills in real-time data automation with FME, and take your operational capabilities to the next level.
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation – Safe Software
Hiring and retaining software development talent is next to impossible for AEC firms and other industries alike.
Join us and guest speakers from HOK, a leader in the AEC industry, as they share their success in navigating the tight talent market through the use of no-code solutions and FME.
Discover how HOK approached the process of building a custom tool to automate the creation of projects and user management for Trimble Connect and ProjectSight.
Using a mix of traditional and no-code in FME, our guest speakers will reveal how the team bridged the resource gap and used the available talent pool, producing the mission-critical web app “Trajectory”.
They will also dive into details, illustrating first-hand how JSON data was used as a “glue” between two development groups.
Learn how embracing FME as a no-code solution can unlock potential within your teams, foster collaboration, and drive efficiency.
Powering Real-Time Decisions with Continuous Data Streams – Safe Software
In an era where making swift, data-driven decisions can set industry leaders apart, understanding the world of data streaming and stream processing is crucial. During this webinar, we'll explore:
Stream Processing Overview: Dive into what stream processing entails and the value it brings organizations.
Stream vs. Batch Processing: Learn the key differences and benefits of stream processing compared to traditional batch processing, highlighting the efficiency of real-time data handling.
Mastering Data Volumes: Discover strategies for effectively managing both high and low volume data streams, ensuring optimal performance.
Boosting Operational Excellence: Explore how adopting data streaming can enhance your organization's operational workflows and productivity.
Spatial Data's Role in Streams: Understand the importance of spatial data in stream processing for more informed decision-making.
Interactive Demos: Watch practical demos, from dynamic geofencing to group-based processing.
Plus, we’ll show you how you can do it without coding! Register now to take the first step towards more informed, timely, and precise decision-making for your organization.
The Critical Role of Spatial Data in Today's Data Ecosystem – Safe Software
In today's data-driven landscape, integrating spatial data is becoming increasingly crucial for organizations aiming to harness the full potential of their data. Spatial data offers unique insights based on location, making it a fundamental component for addressing various challenges across different sectors, including urban planning, environmental sustainability, public health, and logistics.
Our webinar delves into the indispensable role of spatial data in data management and analysis. We'll showcase how omitting spatial data from your data strategy not only weakens your data infrastructure, but also limits the depth of your insights. Through real-world case studies, we'll highlight the transformative impact of spatial data, demonstrating its ability to uncover complex patterns, trends, and relationships.
Join us for this introductory-level webinar as we explore the critical importance of spatial data integration in driving strategic decision-making processes. By the end of the webinar, you'll gain a renewed perspective on how spatial data is essential for confronting and overcoming challenges across various domains.
Cloud Revolution: Exploring the New Wave of Serverless Spatial Data – Safe Software
Once in a while, there really is something new under the sun. The rise of cloud-hosted data has fueled innovation in spatial data storage, enabling a brand new serverless architectural approach to spatial data sharing. Join us in our upcoming webinar to learn all about these new ways to organize your data, and leverage data shared by others. Explore the potential of Cloud Native Geospatial Formats in your workflows with FME, as we introduce five new formats: COGs, COPC, FlatGeoBuf, GeoParquet, STAC and ZARR.
Learn from industry experts Michelle Roby from Radiant Earth and Chris Holmes from Planet about these cloud-native geospatial data formats and how they can make data easier to manage, share, and analyze. To get us started, they’ll explain the goals of the Cloud-Native Geospatial Foundation and provide overviews of cloud-native technologies including the Cloud-Optimized GeoTIFF (COG), SpatioTemporal Asset Catalogs (STAC), and GeoParquet.
Following this, our seasoned FME team will guide you through practical demonstrations, showcasing how to leverage each format to its fullest potential. Learn strategic approaches for seamless integration and transition, along with valuable tips to enhance performance using these formats in FME.
Discover how these formats are reshaping geospatial data handling and how you can seamlessly integrate them into your FME workflows and harness the explosion of cloud-hosted data.
Igniting Next Level Productivity with AI-Infused Data Integration Workflows – Safe Software
Learn where FME meets AI in this upcoming webinar to offer you incredible time savings. This webinar is tailored to ignite imaginations and offer solutions to your data integration challenges. As the new digital era sets sail on the winds of AI, the tangibility of its integration in our daily schema is unfolding.
Segment 1, titled “AI: The Good, the Bad and the FME” by Darren Fergus of Locus, navigates through the realms of AI, scrutinizing its pervasive impact while underscoring the symbiotic potential of FME and AI. Join in an engaging demonstration as FME and ChatGPT collaboratively orchestrate a PowerPoint narrative, epitomizing the alliance of AI with human ingenuity.
In Segment 2, “Integrating GeoAI Models in FME” by Dennis Wilhelm and Dr. Christopher Britsch of con terra GmbH, the spotlight veers towards operationalizing AI in our daily tasks through FME. A practical approach to embedding GeoAI Models into FME Workspaces is unveiled, showcasing the ease of incorporating AI-driven methodologies into your FME workflows, skyrocketing productivity levels.
To follow, Segment 3, "Unleash generative AI on your terms!" by Oliver Morris of Avineon-Tensing. While the prospects of Generative AI are thrilling, security and IT reservations, especially with 'phone home' tools, are genuine concerns. However, with open-source tools, you can locally harness large language models. In this demo, we'll unravel the magic of local AI deployment and its seamless integration into an FME workspace.
Bonus! Dmitri will join us for a fourth segment to tie things off, showcasing what he has been up to this week, including using the OpenAI API for texturing in FME, among other projects.
Join us to explore the synergy of FME and AI: opening portals to a realm of revolutionized productivity and enriched user experiences.
The Zero-ETL Approach: Enhancing Data Agility and Insight – Safe Software
In the ever-evolving landscape of data management, Zero-ETL is an approach that is reshaping how businesses handle and integrate their data. This webinar explores Zero-ETL, a paradigm shift from the traditional Extract, Transform, Load (ETL) process, offering a more streamlined, efficient, and real-time data integration method.
We will begin with an introduction to the concept of Zero-ETL, including how it allows direct access to data in its native environment and real-time data transformation, providing up-to-date information with significantly reduced data redundancy.
Next, we'll take you through several demonstrations showing how Zero-ETL can deliver real-time data and enable the free movement of data between systems. We will also discuss the various tools that support all aspects of Zero-ETL, providing attendees with an understanding of how they can adopt this innovative approach in their organizations.
Lastly, the session will conclude with an interactive Q&A segment, allowing participants to gain deeper insights into how Zero-ETL can be tailored to their specific business needs and how they can get started today.
Join us to discover how Zero-ETL can elevate your organization's data strategy.
Discover practical tips and tricks for streamlining your Marketo programs from end to end. Whether you're new to Marketo or looking to enhance your existing processes, our expert speakers will provide insights and strategies you can implement right away.
Selling software today doesn't look anything like it did a few years ago, especially software that runs inside a customer environment. Dreamfactory has used Anchore and Ask Sage to achieve compliance in record time: reducing attack surface to keep vulnerability counts low, and configuring automation to meet those compliance requirements. After achieving compliance, they are staying up to date with Anchore Enterprise in their CI/CD pipelines.
Nic Chaillan, CEO of Ask Sage; Terence Bennet, CEO of Dreamfactory; and Josh Bressers, Anchore's VP of Security, will discuss these hard problems.
In this webinar we will cover:
- The standards Dreamfactory decided to use for their compliance efforts
- How Dreamfactory used Ask Sage to collect and write up their evidence
- How Dreamfactory used Anchore Enterprise to help achieve their compliance needs
- How Dreamfactory is using automation to stay in compliance continuously
- How reducing attack surface can lower vulnerability findings
- How you can apply these principles in your own environment
When you do security right, they won’t know you’ve done anything at all!
Jacquard Fabric Explained: Origins, Characteristics, and Uses – ldtexsolbl
In this presentation, we’ll dive into the fascinating world of Jacquard fabric. We start by exploring what makes Jacquard fabric so special. It’s known for its beautiful, complex patterns that are woven into the fabric thanks to a clever machine called the Jacquard loom, invented by Joseph Marie Jacquard back in 1804. This loom uses either punched cards or modern digital controls to handle each thread separately, allowing for intricate designs that were once impossible to create by hand.
Next, we’ll look at the unique characteristics of Jacquard fabric and the different types you might encounter. From the luxurious brocade, often used in fancy clothing and home décor, to the elegant damask with its reversible patterns, and the artistic tapestry, each type of Jacquard fabric has its own special qualities. We’ll show you how these fabrics are used in everyday items like curtains, cushions, and even artworks, making them both functional and stylish.
Moving on, we’ll discuss how technology has changed Jacquard fabric production. Here, LD Texsol takes center stage. As a leading manufacturer and exporter of electronic Jacquard looms, LD Texsol is helping to modernize the weaving process. Their advanced technology makes it easier to create even more precise and complex patterns, and also helps make the production process more efficient and environmentally friendly.
Finally, we’ll wrap up by summarizing the key points and highlighting the exciting future of Jacquard fabric. Thanks to innovations from companies like LD Texsol, Jacquard fabric continues to evolve and impress, blending traditional techniques with cutting-edge technology. We hope this presentation gives you a clear picture of how Jacquard fabric has developed and where it’s headed in the future.
Project Delivery Methodology on a page with activities, deliverables – CLIVE MINCHIN
I've not found a one-pager like this anywhere, so I created it based on my experiences. This one-pager details a waterfall-style project methodology with defined phases, activities, deliverables, and assumptions. There's nothing in here that conflicts with common sense.
Generative AI technology is a fascinating field that focuses on creating comp... – Nohoax Kanont
Generative AI technology is a fascinating field that focuses on creating computer models capable of generating new, original content. It leverages the power of large language models, neural networks, and machine learning to produce content that can mimic human creativity. This technology has seen a surge in innovation and adoption since the introduction of ChatGPT in 2022, leading to significant productivity benefits across various industries. With its ability to generate text, images, video, and audio, generative AI is transforming how we interact with technology and the types of tasks that can be automated.
TrustArc Webinar - Innovating with TRUSTe Responsible AI Certification – TrustArc
In a landmark year marked by significant AI advancements, it’s vital to prioritize transparency, accountability, and respect for privacy rights with your AI innovation.
Learn how to navigate the shifting AI landscape with our innovative solution, TRUSTe Responsible AI Certification, the first AI certification designed for data protection and privacy. Crafted by a team with 10,000+ privacy certifications issued, this framework integrates industry standards and laws for responsible AI governance.
This webinar will review:
- How compliance can play a role in the development and deployment of AI systems
- How to model trust and transparency across products and services
- How to save time and work smarter in understanding regulatory obligations, including AI
- How to operationalize and deploy AI governance best practices in your organization
Increase Quality with User Access Policies - July 2024 – Peter Caitens
⭐️ Increase Quality with User Access Policies ⭐️, presented by Peter Caitens and Adam Best of Salesforce. View the slides from this session to hear all about “User Access Policies” and how they can help you onboard users faster with greater quality.
Leading Bigcommerce Development Services for Online Retailers – SynapseIndia
As a leading provider of Bigcommerce development services, we specialize in creating powerful, user-friendly e-commerce solutions. Our services help online retailers increase sales and improve customer satisfaction.
Welcome to our third live UiPath Community Day Amsterdam! Come join us for a half-day of networking and UiPath Platform deep-dives, for devs and non-devs alike, in the middle of summer ☀.
📕 Agenda:
12:30 Welcome Coffee/Light Lunch ☕
13:00 Event opening speech
Ebert Knol, Managing Partner, Tacstone Technology
Jonathan Smith, UiPath MVP, RPA Lead, Ciphix
Cristina Vidu, Senior Marketing Manager, UiPath Community EMEA
Dion Mes, Principal Sales Engineer, UiPath
13:15 ASML: RPA as Tactical Automation
Tactical robotic process automation for solving short-term challenges, while establishing standard and re-usable interfaces that fit IT's long-term goals and objectives.
Yannic Suurmeijer, System Architect, ASML
13:30 PostNL: an insight into RPA at PostNL
Showcasing the solutions our automations have provided, the challenges we’ve faced, and the best practices we’ve developed to support our logistics operations.
Leonard Renne, RPA Developer, PostNL
13:45 Break (30')
14:15 Breakout Sessions: Round 1
Modern Document Understanding in the cloud platform: AI-driven UiPath Document Understanding
Mike Bos, Senior Automation Developer, Tacstone Technology
Process Orchestration: scale up and have your Robots work in harmony
Jon Smith, UiPath MVP, RPA Lead, Ciphix
UiPath Integration Service: connect applications, leverage prebuilt connectors, and set up customer connectors
Johans Brink, CTO, MvR digital workforce
15:00 Breakout Sessions: Round 2
Automation, and GenAI: practical use cases for value generation
Thomas Janssen, UiPath MVP, Senior Automation Developer, Automation Heroes
Human in the Loop/Action Center
Dion Mes, Principal Sales Engineer @UiPath
Improving development with coded workflows
Idris Janszen, Technical Consultant, Ilionx
15:45 End remarks
16:00 Community fun games, sharing knowledge, drinks, and bites 🍻
Flame emission spectroscopy is an instrumental technique used to determine the concentration of metal ions in a sample. The flame provides the energy to excite the atoms introduced into it. The instrument involves components such as a sample delivery system, burner, mirror, slits, monochromator, filter, and detector (photomultiplier tube or phototube). Several interferences can arise during analysis of a sample, including spectral interference, ionisation interference, and chemical interference. The technique can be used for both quantitative and qualitative study: for example, to determine lead in petrol, to determine alkali and alkaline earth metals, and to assess the fertilizer requirement of soil.
FME Data Transformation for the Geographic Support System Initiative
1. FME Data Transformation for the Geographic Support System Initiative
Jay E. Spurlin
Software Architect and Development Manager for the
GSS-I Feature Source Evaluation software system
April 8, 2013
2. U.S. Census Bureau
• The Census Bureau serves as the leading
source of quality data about the nation's
people and economy. We honor privacy,
protect confidentiality, share our expertise
globally, and conduct our work openly. We
are guided on this mission by our strong and
capable workforce, our readiness to innovate,
and our abiding commitment to our
customers.
3. Geography Division
• The Geography Division plans, coordinates, and
administers all geographic and cartographic activities
needed to facilitate the Census Bureau's statistical
programs throughout the US and its territories. We
manage the Census Bureau's programs to
continuously update features, boundaries and
geographic entities in TIGER and the Master Address
File (MAF). We also conduct research into geographic
concepts, methods, and standards needed to facilitate
the Census Bureau's data collection and dissemination
programs.
4. GSS-I
• In support of the 2020 Decennial Census, the Census Bureau
is evaluating what areas should be targeted for a traditional,
on-the-ground address canvassing operation and in which
areas a traditional canvassing operation is not necessary.
• The task the Census Bureau is undertaking is determining
how to decide which areas should be considered for targeting
– GEO has evaluated the MAF/TIGER database and assigned
quality indicators to each of the census tracts
– A Targeted Address Canvassing strategy has been developed
that contains an inventory of criteria for evaluation
5. GSS-I
• The Geographic Partnership program is now underway.
– GEO is receiving both address and spatial data from invited partners
• This data is at the state, county, and local level.
• The data is being evaluated and integrated with the MAF/TIGER database.
• The next step is to determine what level of feedback we can give to the partners
about their data.
• GEO is also working with statisticians on predictive modeling to help
determine where to target.
• The combination of the evaluation of the current MAF/TIGER
database, the partner data, and the predictive modeling will
contribute to the recommendation on which areas of the country
should be considered for targeting.
6. The Geographic Partnership Program
• A partner provides a set of source files
• The source files are moved inside the Census firewall via a secure web-exchange module
• The content inventory of the files undergoes initial verification
• The files are preserved, as supplied, for later reference
• A more detailed content assessment is done, including verification that the files meet the minimum guidelines for content and metadata
• The files are prepared for automated processing, including re-projection and mapping to a
standardized schema
• A series of (mostly) automated checks is run, which provides metrics about the data in the
files
• An interactive review is conducted, in which the files and their associated metrics are
reviewed and a decision is made how to capture any new data
• Any data that are not useful for updating the MAF/TIGER database get removed from the
files
• Features or addresses are added or modified, using an automated conflate and review
process – or – an interactive update process
7. Feature Source Evaluation Software
• A number of MAF/TIGER spatial layers will be extracted for the extent of the partner
entity
• An analyst will use the supplied data and metadata to map the provided source
schema to a standardized schema, and the supplied road centerline file will be
converted to an ArcSDE layer, re-projected, and the name and MTFCC mappings
applied
• The feature names in the source file will be standardized to the parsed, MAF/TIGER
naming conventions
• The standardized feature names will be checked to see if any contain illegal characters or prohibited or generic names
• A topological check will be run, to gauge the topological stability of the source file
• A completeness / change detection check will be run to attempt to identify areas in
the source file that contain features not found in MAF/TIGER
• A comparison will be run between the universe of feature names in the source file
and the universe of feature names found in MAF/TIGER within the extent of the entity
• All intersections that meet the requirements for CE95 assessment will be identified
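The illegal-character and prohibited-name check described above can be sketched in plain Python. The allowed character set and the prohibited list below are illustrative assumptions, not the Census Bureau's actual rules (in production this is a Java application):

```python
import re

# Illustrative rules only: real MAF/TIGER naming rules differ.
PROHIBITED = {"UNNAMED", "UNKNOWN", "ROAD", "STREET"}
LEGAL = re.compile(r"^[A-Z0-9 '\-\.]+$")  # assumed legal character set

def name_issues(name):
    """Return a list of problems found with a standardized feature name."""
    issues = []
    if not LEGAL.fullmatch(name):
        issues.append("illegal characters")
    if name.strip().upper() in PROHIBITED:
        issues.append("prohibited or generic name")
    return issues

print(name_issues("MAIN ST"))   # clean name: no issues
print(name_issues("UNNAMED"))   # generic placeholder name
```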
8. Previous FME Technology Architecture
• FME Workspaces were developed using FME Workbench 2012 on
desktop workstations, running 32-bit Windows XP Service Pack 3
• FME Server 2012 (FME Engine only), on batch servers running
Linux Redhat Enterprise 5 connected to a SAN (Storage Area
Network)
• Linux Batch Server:
– Cronacle job-queueing system
– Perl and shell scripts
– FME Server (command-line invocation of FME Engines)
– Oracle Run-Time Client
• Data stores: MAF/TIGER (Oracle Database); Shapefiles on SAN
9. New FME Technology Architecture
• FME Workspaces are developed using FME Workbench 2012 SP3 on
desktop workstations, running 32-bit Windows XP Service Pack 3
• FME Server 2012 SP3 (FME Server Console), on batch servers running
Linux Redhat Enterprise 5
• FME Server 2012 SP3, on Windows server, with SAN (Storage Area
Network) disk(s) mounted via Samba
• Linux Batch Server:
– Cronacle job-queueing system
– Perl and shell scripts
– FME Server Console (remote job submission to FME Server)
– Oracle Run-Time Client
• Windows Web Server:
– ArcGIS for Server
– FME Server (full installation)
• Data stores: MAF/TIGER (Oracle Database); Shapefiles on SAN; ArcSDE Geodatabase
11. Topology Check
• The Topology Check workspace compiles a number of topology- and tolerance-based metrics:
– Gaps – endpoints within 5 meters of any line segment
– Overshoots – line segments extending less than 5 meters beyond an
intersection
– Tiny Features – features with a total length less than 5 meters
– Floating Features – features or connected sets of features that are not
connected to the rest of the road network
– Exact Duplicates – features whose geometry and name are identical to
another feature
– Coincident – features whose geometry overlaps with another feature
– Crossing – features that cross but do not intersect at a node
– Multi-part – features that consist of multiple geometry parts
– Cutbacks – features containing angles less than 25 degrees
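Two of the simpler metrics above, tiny features and exact duplicates, can be sketched in plain Python. This is a minimal illustration of the definitions on the slide; the production checks are chains of FME transformers:

```python
import math

TINY_LENGTH_M = 5.0  # threshold from the slide

def length(coords):
    """Total length of a polyline given as a list of (x, y) tuples."""
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(coords, coords[1:]))

def find_tiny_features(features):
    """Indices of features with total length under 5 meters."""
    return [i for i, (coords, _) in enumerate(features)
            if length(coords) < TINY_LENGTH_M]

def find_exact_duplicates(features):
    """Index pairs whose geometry and name are both identical."""
    seen, dupes = {}, []
    for i, (coords, name) in enumerate(features):
        key = (tuple(coords), name)
        if key in seen:
            dupes.append((seen[key], i))
        else:
            seen[key] = i
    return dupes

roads = [
    ([(0, 0), (3, 0)], "MAIN ST"),     # 3 m long: a tiny feature
    ([(0, 0), (100, 0)], "MAIN ST"),
    ([(0, 0), (100, 0)], "MAIN ST"),   # exact duplicate of the previous
]
print(find_tiny_features(roads))     # [0]
print(find_exact_duplicates(roads))  # [(1, 2)]
```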
12. Completeness / Change Detection Check
• The MAF/TIGER road centerline features and the
feature source file road centerline features will be
compared using an FME workspace.
• The MAF/TIGER features will be Buffered to a
distance of 15 meters, then “overlaid” with the
source file features.
• Any source file feature parts that fall outside of the
Buffer areas will be chained together, and the total
length of difference (and of each part) will be
reported as an evaluation metric.
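A rough approximation of this buffer comparison can be written in plain Python by testing each source segment's endpoints and midpoint against the reference segments, instead of building true buffer polygons. This is a simplification of the FME workspace described above, for illustration only:

```python
import math

BUFFER_M = 15.0  # buffer distance from the slide

def pt_seg_dist(p, a, b):
    """Distance from point p to line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def outside_length(source_segs, ref_segs):
    """Sum the length of source segments whose endpoints and midpoint all
    fall farther than BUFFER_M from every reference segment."""
    total = 0.0
    for a, b in source_segs:
        mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
        if all(min(pt_seg_dist(p, ra, rb) for ra, rb in ref_segs) > BUFFER_M
               for p in (a, mid, b)):
            total += math.hypot(b[0] - a[0], b[1] - a[1])
    return total

ref = [((0, 0), (100, 0))]                        # MAF/TIGER centerline
src = [((0, 5), (100, 5)), ((0, 50), (100, 50))]  # partner centerlines
print(outside_length(src, ref))  # only the second segment is outside: 100.0
```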
13. CE95 Qualifying Intersection Identification
• Qualifying intersections must meet the
following criteria:
– Must consist of three roads (a “T” intersection)
or four roads (an “X” intersection)
– Must consist of only secondary roads or local
roads
– Must meet at 90 or 180 degree angles, with a
15 degree plus/minus tolerance
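The qualifying rules can be expressed as a small predicate. The road-class codes and the adjacent-angle interpretation of "meet at 90 or 180 degrees" below are assumptions for illustration, not the production logic:

```python
ANGLE_TOL = 15.0  # plus/minus tolerance from the slide

def qualifies(road_bearings, road_classes):
    """road_bearings: compass bearing (degrees) of each road leg leaving the
    node; road_classes: 'S' for secondary, 'L' for local (assumed codes)."""
    # Must be a "T" (3 legs) or "X" (4 legs) intersection
    if len(road_bearings) not in (3, 4):
        return False
    # Must consist of only secondary or local roads
    if any(c not in ("S", "L") for c in road_classes):
        return False
    # Every pair of angularly adjacent legs must meet at ~90 or ~180 degrees
    bearings = sorted(b % 360 for b in road_bearings)
    n = len(bearings)
    for i in range(n):
        ang = (bearings[(i + 1) % n] - bearings[i]) % 360
        if not (abs(ang - 90) <= ANGLE_TOL or abs(ang - 180) <= ANGLE_TOL):
            return False
    return True

print(qualifies([0, 90, 180], ["L", "L", "S"]))   # right-angle "T": True
print(qualifies([0, 45, 180], ["L", "L", "L"]))   # skewed leg: False
```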
14. Thank You!
Questions?
For more information:
Jay E. Spurlin
jay.e.spurlin@census.gov
U.S. Census Bureau
http://www.census.gov/geo/www/gss/
Editor's Notes
I work in the Geography Division – or GEO, as we refer to it. We manage MAF/TIGER (Topologically Integrated Geographic Encoding and Referencing), which is a geospatial database system. The data is stored in Oracle Spatial Topology Manager format, and is used in support of various censuses and surveys of the Census Bureau.
This is the basic set of steps through which a set of partner-supplied source files proceeds. Currently, this is a highly manual process, and most of the processing is done on shapefiles using ArcGIS for Desktop. A partner provides a set of source files – this could be through a Regional Office contact, Community TIGER, or via a direct upload. The source files are moved inside the Census firewall via a secure web-exchange module. The content inventory of the files undergoes initial verification, to make sure someone has not accidentally supplied their laundry list. The files are preserved, as supplied, for later reference. This provides a re-start point, if it is ever necessary – as well as a reference against which future submissions could be compared to determine change over time. A more detailed content assessment is done, including verification that the files meet the minimum guidelines for content and metadata. The files are prepared for automated processing, including re-projection and mapping to a standardized schema. The feature names are standardized to fit the parsed MAF/TIGER naming convention, and metadata is used to derive the MAF/TIGER Feature Classification Code (or MTFCC) for each record. A series of (mostly) automated checks is run, which provides metrics about the data in the files. For addresses, this includes a range of geocoding checks and comparisons for the addresses and for the address point locations, if they were provided. For the spatial features, I'll talk more about these checks in a moment. An interactive review is conducted, in which the files and their associated metrics are reviewed and an assessment is made as to how many new features or addresses have been supplied, as well as how many attribute or shape updates.
Based on this review, a decision is made about how to capture any new data – whether the data can continue through an automated update process or should be handled through an interactive update process. If the automated process is appropriate, then any data that are not useful for updating the MAF/TIGER database get removed from the files. Features or addresses are added and/or modified, using the method chosen during the interactive review – either an automated conflate and review process, or an interactive update process.
For the purposes of this discussion, we will focus on the Feature Source Evaluation software – in contrast to the Address Source Evaluation software. There are two separate, dedicated software systems for the evaluation of spatial features and addresses, though the architecture of the GSS-I is integrated to include both. The business model, hardware and software architecture, technology architecture, and security models have been integrated; it is really only the application architectures that have been separated out – and that only because there are established, separate areas of development expertise for spatial features, geographic entities, and addresses. The list of functionality on this slide indicates the first set of functions targeted for production release at the end of March 2013. Other checks have been proposed, and will likely be added to the software at a future date. Basically, each of the pieces of functionality listed corresponds to a module in the Feature Source Evaluation software system. A number of MAF/TIGER spatial layers will be extracted for the extent of the partner entity. These will include the road centerline layer, a number of geographic entity boundaries for reference, and the topological edge layer with the primary feature name for each edge. These layers will be extracted using automated FME workspaces, but they are fairly simple and obvious – they basically just read from Oracle Spatial using an SDO_FILTER SQL query, narrow the selection with an AreaOnAreaOverlayer or Clipper, and write to an ArcSDE geodatabase, so I don't plan to show any examples of those. An analyst will use the supplied data and metadata to map the provided source schema to a standardized schema, and the supplied road centerline file will be converted to an ArcSDE layer, re-projected if necessary, and the name and MTFCC mappings applied.
We will look at some example transformers in a few minutes. The feature names in the source file will be standardized to the parsed MAF/TIGER naming conventions. In production, this will be a Java application, but for the current, manual procedures, an FME workspace is making an HTTPFetcher call to a published web service to do the feature name standardization, with a Decelerator to keep from overloading the web service. The standardized feature names will be checked to see if any contain illegal characters or prohibited or generic names; another Java application. A topological check will be run, to gauge the topological stability of the source file. This will be accomplished using a fairly complicated FME Workspace, which we will look at in detail shortly. A completeness / change detection check will be run to attempt to identify areas in the source file that contain features not found in MAF/TIGER. This will also be accomplished using an FME Workspace, which we will also look at in a moment. A comparison will be run between the universe of feature names in the source file and the universe of feature names found in MAF/TIGER within the extent of the entity; this will be another Java application. All intersections that meet the requirements for conducting the CE95 accuracy assessment will be identified. The CE95 accuracy value is stated as a distance in meters, and denotes the circular standard error confidence – this is stating a 95% chance each coordinate falls within that distance from “ground truth”. This is the final FME workspace that we will be looking at today.
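The Decelerator's role here – pacing calls to the standardization web service – amounts to simple rate limiting, sketched below in Python. The service call is stubbed out, and all names are illustrative:

```python
import time

def rate_limited_calls(items, call, min_interval_s=0.5):
    """Apply `call` to each item, sleeping between requests so no two
    calls start less than min_interval_s apart - the same role the
    Decelerator transformer plays in the FME workspace."""
    results = []
    last = 0.0
    for item in items:
        wait = min_interval_s - (time.monotonic() - last)
        if wait > 0:
            time.sleep(wait)
        last = time.monotonic()
        results.append(call(item))
    return results

# Stand-in for the name-standardization web service call:
print(rate_limited_calls(["main st", "elm ave"], str.upper, 0.01))
```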
Previously, our general technology architecture as it related to FME was very simple. FME Server was installed on our production Linux batch servers, and FME Engines were invoked via the command line from Perl scripts driven by Cronacle-based control systems.

To keep things simple and better highlight the differences in architecture, the illustrations on this slide and the next depict only the production, batch configuration as it relates to FME.
The technology architecture for FME was restructured for GSS-I to support products and processes that depend on ArcGIS on Windows. The Geography Division deployment of ArcGIS for Server is limited to Windows servers, because a Linux deployment was not seen as a viable option for various reasons. This prompted us to research and develop a new technology architecture pattern for utilizing FME. The old pattern is still in use as well, but this new pattern will be applied for the GSS-I and several other new software systems.
One of the business functions for which we are utilizing FME is crosswalking (or "transmogrification", as some of our subject matter experts have taken to calling it). This mapping of each source file schema to a standard schema is configured using FME Workbench, and the data transformation is done using FME Server. Source schemas can, and do, vary widely. As you might imagine, the string manipulation and filter transformers come in extremely handy while doing these mappings.

The example on the left shows the use of the AttributeValueMapper transformer to transform a set of road type identifiers into MAF/TIGER Feature Classification Codes.

The example on the right shows the use of the StringSearcher transformer to find all instances of a street classification code that end with the digit '5', and then set the MTFCC value to the code that designates the feature as a "Ramp".
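Outside of FME, the two mappings just described amount to a dictionary lookup plus a regular-expression test. In this sketch the source road-type codes ("1", "2", "4", "25") are invented for illustration, as is the default-to-local fallback; S1630 is the MTFCC for a ramp, and S1100/S1200/S1400 are the primary, secondary, and local road codes.

```python
import re

# AttributeValueMapper analogue: hypothetical source codes -> MTFCC.
TYPE_TO_MTFCC = {
    "1": "S1100",  # primary road
    "2": "S1200",  # secondary road
    "4": "S1400",  # local street
}

def map_mtfcc(road_type):
    # StringSearcher analogue: any classification code ending in the
    # digit '5' designates a ramp, per the rule on the slide.
    if re.search(r"5$", road_type):
        return "S1630"  # Ramp
    # Fallback to local road is an assumption for this sketch only.
    return TYPE_TO_MTFCC.get(road_type, "S1400")

print(map_mtfcc("25"))  # S1630
print(map_mtfcc("2"))   # S1200
```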
The topology check workspace uses various transformers to collect metrics about certain types of features or feature interactions in the feature source file. Please note: not all of these are technically "wrong" topologically; they are only meant to be markers for identifying general topology or network stability and to predict MAF/TIGER update behavior. The list of metrics might shrink or grow with time, as more partner files get processed and we learn more about what situations indicate data issues or cause problems during the update of the MAF/TIGER database.

{show topology workspace and explain}

The road centerlines are projected to the North American Lambert Conformal Conic projection, which preserves shape (and thereby angle).
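As one example of the kind of metric such a check collects (this is an illustrative sketch, not the workspace's actual transformer chain): node degree, computed from line segments given as coordinate-pair endpoints. Degree-1 nodes are dangles, which are often legitimate dead ends but are still worth counting, and unusually high-degree nodes can flag complex or unstable intersections.

```python
from collections import defaultdict

def node_degrees(segments):
    """Count how many segment endpoints touch each node coordinate."""
    degrees = defaultdict(int)
    for start, end in segments:
        degrees[start] += 1
        degrees[end] += 1
    return degrees

# Three segments: a straight run (0,0)-(1,0)-(2,0) plus a spur to (1,1).
segments = [((0, 0), (1, 0)), ((1, 0), (2, 0)), ((1, 0), (1, 1))]
degrees = node_degrees(segments)
dangles = [node for node, d in degrees.items() if d == 1]
print(sorted(dangles))   # the three dead ends; (1, 0) is a 3-way node
```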
{show the change detection workspace and explain}
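One common change-detection strategy, sketched here under assumptions of my own (this is not necessarily the workspace's exact logic), is to flag a source road segment as "possibly new" when its midpoint lies farther than some tolerance from every reference (MAF/TIGER) segment.

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection parameter to stay on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def possibly_new(source_segments, reference_segments, tolerance):
    """Return source segments whose midpoint is far from all reference segments."""
    flagged = []
    for a, b in source_segments:
        mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
        if all(point_segment_distance(mid, ra, rb) > tolerance
               for ra, rb in reference_segments):
            flagged.append((a, b))
    return flagged

reference = [((0, 0), (10, 0))]
source = [((0, 0.5), (10, 0.5)),   # close to the reference road: matched
          ((0, 50), (10, 50))]     # nothing nearby: candidate new road
print(possibly_new(source, reference, tolerance=2.0))
```

A production check would of course work per area rather than per segment and use spatial indexing, but the buffer-style tolerance test is the core idea.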
For the CE95 accuracy assessment, qualifying intersections must be perpendicular 'T' or 'X' intersections (plus or minus 15 degrees) on secondary and/or local roads.

{show CE95 QI workspace and explain}

The road type selection is accomplished using a TestFilter. The names of the attributes that contain the MTFCC code (road type) and road name are passed in via published parameters. The road centerlines are projected to the North American Lambert Conformal Conic projection, which preserves shape (and thereby angle). The TopologyBuilder is used to find all of the intersection nodes. 'T' and 'X' intersections are identified by counting the number of rays emanating from each node star (the number of elements in the _node_angle list). The _fme_arc_angle values are exposed with an AttributeExposer, and a composite test in a Tester transformer checks the angle ranges. The nodes are projected back to NAD83.

The requirement was to create at least 200 randomly selected nodes, with the goal of assessing the accuracy of 100 of them. A RandomNumberGenerator and Sorter are used to randomly sort and output all the nodes, allowing the user to weed through as many as necessary. The CoordinateExtractor is used to expose the coordinate x and y values as attributes. The StringConcatenator is used to string together all of the road names, which were preserved from the line segments during the topology build.
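The qualifying-intersection test can be sketched outside FME as follows. This is a hedged, simplified version of the composite Tester logic: given the bearings of the rays leaving a node, a node qualifies when the gaps between adjacent rays form a perpendicular 'T' (two ~90 degree gaps and one ~180) or 'X' (four ~90 degree gaps), each within the 15 degree tolerance.

```python
def qualifying_intersection(ray_angles_deg, tol=15.0):
    """True for a perpendicular 'T' (3 rays) or 'X' (4 rays) node star,
    within +/- tol degrees. ray_angles_deg are the bearings of the rays
    leaving the node."""
    n = len(ray_angles_deg)
    if n not in (3, 4):          # only 'T' and 'X' intersections qualify
        return False
    angles = sorted(a % 360.0 for a in ray_angles_deg)
    # Angular gaps between adjacent rays around the node star.
    gaps = [(angles[(i + 1) % n] - angles[i]) % 360.0 for i in range(n)]
    near90 = sum(abs(g - 90.0) <= tol for g in gaps)
    near180 = sum(abs(g - 180.0) <= tol for g in gaps)
    if n == 4:
        return near90 == 4               # 'X': four right-angle gaps
    return near90 == 2 and near180 == 1  # 'T': two 90s and one 180

print(qualifying_intersection([0, 90, 180, 270]))  # True: perfect 'X'
print(qualifying_intersection([0, 90, 180]))       # True: perfect 'T'
print(qualifying_intersection([0, 45, 180]))       # False: skewed
```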