About Kilroy Realty Corporation
- Kilroy (NYSE: KRC) is a leading U.S. landlord and developer with approximately 17 million square feet of primarily office and life science space, with operations in San Diego, Greater Los Angeles, the San Francisco Bay Area, Greater Seattle and Austin, Texas.
- Kilroy’s Vision is to be a premier and sustainable commercial real estate operator and developer, sought after by tenants, preferred by investors, and respected by competitors.
- Kilroy’s mission is to create and operate exceptional real estate assets where people live, work, and engage with their communities.
- At Kilroy, our people are our greatest resource. We strive to maintain a culture of continuous growth, recognizing and rewarding performance in all facets of the company. Integrity matters, which means how we operate is just as important as what we deliver. We aim to operate as a cohesive team and believe that diversity of thought and perspective results in better outcomes.
About the Opportunity
The Senior Data Engineer will architect, build, and optimize mission-critical data pipelines to support enterprise analytics. This role involves working with structured, semi-structured, and unstructured data to build scalable, automated data solutions. You will be responsible for designing, implementing, and maintaining data models and ETL processes that transform raw data from source systems into actionable insights within data marts and the enterprise data warehouse. This individual is expected to have a deep understanding of how data flows between platforms to maximize the business value of data. This position reports to the Senior Director, Enterprise Applications and collaborates closely with the Manager of Information Systems to drive strategic data initiatives.
Opportunity Requirements
- Degree in Business with emphasis on Computer Technology/Information Systems OR in Business/Operations with a strong interest in IT; or an equivalent combination of education and experience
- 10+ years of experience in data engineering, ETL development, or enterprise data architecture.
- Deep expertise in Azure cloud data services, including Azure Data Factory, Synapse Analytics, Microsoft Fabric, and Databricks.
- Familiarity with DevOps, CI/CD (Azure DevOps), containerization (Docker, Kubernetes), and Terraform for data infrastructure automation.
- Working knowledge of BI/Reporting Tools (Power BI, Tableau, Looker) and API-based data integrations.
- Strong proficiency in Data Models, SQL, and Python with a focus on performance tuning and distributed computing.
- Extensive experience building and optimizing ETL/ELT pipelines for large-scale data processing.
- Strong understanding of data modeling (star schema, snowflake schema) and database optimization techniques.
- Experience with real-time and batch data processing architectures, including Kafka, Spark Streaming, or Azure Event Hub.
- Hands-on experience with data security, governance, and compliance frameworks.
- Ability to translate complex business requirements into scalable data solutions, collaborating with both technical and non-technical stakeholders.
- Familiarity with version control systems like Git
- Understanding of SOX compliance and audit procedures preferred
- Willingness to be a team player and handle tasks of varying complexity
- Strong attention to detail with project management experience and good organizational skills
- Ability to adapt to shifting priorities
- Self-starter with the ability to manage multiple tasks independently
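As an informal illustration of the data-modeling skills listed above, the sketch below splits flat records into a star schema: repeated attributes move to a dimension table keyed by a surrogate key, while measures stay in a fact table. All field names and sample records are hypothetical, not Kilroy's actual schema.

```python
# Minimal star-schema sketch: split flat records into a dimension and a fact table.
# Field names ("property", "rent", etc.) are hypothetical illustrations only.

raw_leases = [
    {"lease_id": 1, "property": "Tower A", "city": "San Diego", "rent": 12000},
    {"lease_id": 2, "property": "Tower A", "city": "San Diego", "rent": 15000},
    {"lease_id": 3, "property": "Campus B", "city": "Austin", "rent": 9000},
]

def to_star_schema(rows):
    """Deduplicate property attributes into a dimension; keep measures in a fact."""
    surrogate_keys = {}        # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for row in rows:
        key = (row["property"], row["city"])
        if key not in surrogate_keys:
            surrogate_keys[key] = len(surrogate_keys) + 1
            dim_rows.append({"property_key": surrogate_keys[key],
                             "property": row["property"],
                             "city": row["city"]})
        fact_rows.append({"lease_id": row["lease_id"],
                          "property_key": surrogate_keys[key],
                          "rent": row["rent"]})
    return dim_rows, fact_rows

dims, facts = to_star_schema(raw_leases)
```

The same split is what the role would express in SQL as dimension and fact tables; keeping the surrogate-key assignment in one place is what makes the fact table joinable and the dimension free of duplicates.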
Preferred Qualifications
- Experience in real estate, financial services, or investment industries.
- Familiarity with serverless computing (AWS Lambda, Azure Functions) for event-driven data processing.
- Experience with data governance and cataloging tools such as Microsoft Purview, including metadata management, data lineage tracking, and access control to ensure compliance and data discoverability.
Summary of Responsibilities
The core responsibilities of this position include, but are not limited to, the following:
- Collaborate with other departments and the technical team to define requirements and scope for reports and data projects.
- Design, develop, and maintain ETL/ELT pipelines to ingest, transform, and load data from various sources into an Azure environment, with a focus on Microsoft Fabric.
- Automate and optimize data workflows for scalability and efficiency.
- Connect disparate data sources, such as databases, APIs, cloud platforms, and on-premises systems, into a unified architecture.
- Collaborate with internal teams to integrate data into analytical tools such as Power BI.
- Build and maintain data warehouses, data lakes, or other storage solutions (e.g., Microsoft Fabric Lakehouse, Microsoft Fabric Warehouse).
- Ensure the accuracy, integrity, and consistency of data across all systems.
- Implement and maintain data security, compliance, and privacy protocols.
- Establish and enforce best practices for data management, quality, and documentation.
- Provide support for data-related issues and optimize performance for reporting.
- Monitor and improve the performance of databases and data pipelines.
- Troubleshoot bottlenecks and resolve issues in data processing workflows.
- Stay updated on emerging technologies and best practices in data engineering.
- Propose and implement innovative solutions to improve the organization’s data infrastructure.
- Architect and build large-scale ETL/ELT pipelines using Azure Data Factory, Microsoft Fabric Dataflows, and Databricks notebooks.
- Develop advanced data models for data warehouses, data marts, and data lakes, ensuring scalability and efficiency.
- Develop and maintain scalable CI/CD pipelines and infrastructure-as-code (IaC) methodologies.
- Integrate data across disparate sources, including APIs, databases, streaming platforms, cloud services, and on-prem systems.
- Develop real-time and batch data processing solutions, enabling advanced AI, machine learning, and analytics applications.
- Ensure high availability and reliability of enterprise data platforms, proactively addressing latency and performance bottlenecks.
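The pipeline responsibilities above can be sketched, very informally, as a sequence of extract, validate, transform, and load stages composed from plain functions. In practice these stages would map onto Azure Data Factory activities or Fabric Dataflows; the source data, field names, and rejection rule below are all hypothetical stand-ins.

```python
# Minimal ETL sketch: extract -> validate -> transform -> load.
# Sources, fields, and checks are hypothetical illustrations only.

def extract():
    # Stand-in for an API or database read returning raw string records.
    return [{"id": "101", "sq_ft": "2500"}, {"id": "102", "sq_ft": ""}]

def validate(rows):
    # Drop rows that fail a basic integrity check; report how many were rejected.
    good = [r for r in rows if r["sq_ft"].isdigit()]
    return good, len(rows) - len(good)

def transform(rows):
    # Cast types so downstream consumers (e.g. a warehouse table) get clean data.
    return [{"id": int(r["id"]), "sq_ft": int(r["sq_ft"])} for r in rows]

def load(rows, sink):
    # Stand-in for a warehouse or lakehouse write.
    sink.extend(rows)
    return len(rows)

def run_pipeline(sink):
    rows, rejected = validate(extract())
    loaded = load(transform(rows), sink)
    return loaded, rejected

warehouse = []
loaded, rejected = run_pipeline(warehouse)
```

Surfacing the rejected-row count from `validate` mirrors the data-quality and monitoring duties listed above: a pipeline that silently drops bad rows is much harder to audit than one that reports what it rejected.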
What we offer
At Kilroy, base pay is one part of our total compensation package and is determined within a range. This provides the opportunity to progress as you grow and develop within the role. The base pay range for this role is between $164,000 and $195,000, and your base pay will depend on your skills, experience and training, knowledge, licensure and certifications, and other business and organizational needs. It is not typical for an individual to be hired at or near the top of the range for their role, and compensation decisions are dependent on the facts and circumstances of each case. This role is eligible for an annual discretionary bonus as well.
Our comprehensive group health benefits program is built around your total health and provides employees and their families with care and coverage designed to help you thrive. Our health and wellness offerings include medical, dental, and vision coverage with FSA and HSA options, Group Life & Disability, LTD coverage, and much more. Ancillary programs include a retirement savings plan with a competitive employer match, and employee support programs such as our parental leave coaching program, wellness, and commuter benefits, just to name a few. We invite you to visit our website at www.kilroyrealty.com to learn more.
Apply for this Position
To apply, email recruiting@recruiting.kilroyrealty.com. Please include the position title in the subject line. No phone calls, please.
EEO/AA/M/F/Vet/Disability Employer.