This is a complete, ready-to-use system with clear levels, competencies, and progression paths.
Engineering Growth Framework
Structure Overview
| Component | Purpose |
| --- | --- |
| 6 Levels | From junior to staff+ (IC track), plus a parallel management track |
| 4 Competency Pillars | Technical, Execution, Collaboration, Impact |
| Behavioral Anchors | Observable, measurable behaviors per level |
| Dual Track | Individual Contributor (IC) and Engineering Manager (EM) |
Level Definitions
Individual Contributor (IC) Track
| Level | Title | Typical Scope | Time at Level |
| --- | --- | --- | --- |
| L1 | Junior Engineer | Tasks with guidance | 1-2 years |
| L2 | Engineer | Features independently | 1-3 years |
| L3 | Senior Engineer | Projects, mentors others | 2-4 years |
| L4 | Staff Engineer | Domain/area ownership | 3-5 years |
| L5 | Principal Engineer | Company-wide impact | 4+ years |
| L6 | Distinguished Engineer | Industry impact | 5+ years |
Engineering Management (EM) Track
| Level | Title | Typical Scope |
| --- | --- | --- |
| M3 | Engineering Manager | Team (3-8 engineers) |
| M4 | Senior EM | Multiple teams/area |
| M5 | Director of Engineering | Department |
| M6 | VP Engineering | Organization |
The 4 Competency Pillars
Technical Excellence
| Competency | L1 | L2 | L3 | L4 | L5 |
| --- | --- | --- | --- | --- | --- |
| Code Quality | Writes working code with review | Clean, tested code | Designs for maintainability | Sets technical standards | Defines org-wide patterns |
| Architecture | Follows existing patterns | Extends designs | Designs subsystems | Owns system architecture | Multi-system strategy |
| Technical Depth | 1 language/framework | 2-3 technologies | Deep in 1, broad in others | Deep in domain, T-shaped | Multiple deep domains |

Behavioral Anchors (L3 example):
- Designs APIs that other teams adopt without friction
- Refactors legacy code without breaking production
- Debugs complex production issues across service boundaries
Execution
| Competency | L1 | L2 | L3 | L4 | L5 |
| --- | --- | --- | --- | --- | --- |
| Delivery | Completes assigned tasks | Delivers features end-to-end | Leads project delivery | Drives multi-team initiatives | Sets org delivery standards |
| Estimation | Estimates own tasks | Estimates features | Estimates projects | Forecasts roadmap delivery | Strategic planning |
| Risk Management | Raises blockers | Mitigates own risks | Manages project risks | Anticipates systemic risks | Organizational risk strategy |

Behavioral Anchors (L4 example):
- Breaks down an ambiguous 6-month initiative into deliverable milestones
- Identifies dependencies 2 quarters ahead and resolves conflicts
- Delivers a project with 20% scope reduction but 100% of business value preserved
Collaboration
| Competency | L1 | L2 | L3 | L4 | L5 |
| --- | --- | --- | --- | --- | --- |
| Communication | Clear in standups | Documents decisions | Influences team direction | Aligns cross-functional teams | External speaking/writing |
| Mentorship | Receives feedback | Mentors juniors | Formal mentorship | Scales mentorship (programs) | Industry mentorship |
| Conflict Resolution | Escalates issues | Resolves 1:1 conflicts | Mediates team disputes | Resolves cross-team tensions | Organizational culture |

Behavioral Anchors (L3 example):
- Onboards 2 new engineers who both reach productivity in under 1 month
- Writes RFCs that get adopted by default across teams
- Gives feedback that changes behavior without defensiveness
Impact
| Competency | L1 | L2 | L3 | L4 | L5 |
| --- | --- | --- | --- | --- | --- |
| Scope | Task | Feature | Project/Team | Domain/Area | Company/Industry |
| Business Impact | Completes work | Measurable feature impact | Team-level metrics | Area-level outcomes | Company-level transformation |
| Innovation | Implements solutions | Optimizes existing | Introduces new approaches | Creates new capabilities | Disrupts industry standards |

Behavioral Anchors (L5 example):
- An architecture decision saves $2M/year in infrastructure costs
- An open-source tool is adopted by 500+ companies
- Technical strategy enables a new business line
Progression Mechanics
Promotion Criteria (Must Meet All)
- Sustained Performance: operating at the next level for 6+ months
- Business Need: a role exists at the next level (promotion is not automatic)
- Scope Expansion: actually doing the work of the next level
- Peer Calibration: consistent with others at the target level
Calibration Process
Quarterly:
├── Self-assessment against framework
├── Manager assessment
├── Peer feedback (360)
├── Calibration meeting (cross-manager)
└── Growth plan for gaps
L3 → M3 is NOT a promotion. It is a lateral move into a different skill set.
Practical Implementation
Week 1-2: Rollout
- Present the framework to the team
- Everyone self-assesses their current level
- Identify gaps (individual + team)
Month 1: Calibration
- Manager assessments
- 360 feedback collection
- Level calibration across managers
Ongoing: Growth Plans
Each engineer has:
├── Current level with evidence
├── 2-3 specific competencies to develop
├── Projects/experiences to get there
├── Mentor at target level
└── Check-in every 4-6 weeks
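One way to keep growth plans from going stale is to track them as data. A minimal sketch of the checklist above (the structure and names here are illustrative, not prescribed by the framework):

```python
from dataclasses import dataclass, field


@dataclass
class GrowthPlan:
    """One engineer's growth plan, mirroring the checklist above."""
    engineer: str
    current_level: str                                   # e.g. "L2", with evidence
    evidence: list[str] = field(default_factory=list)
    focus_competencies: list[str] = field(default_factory=list)  # 2-3 to develop
    mentor: str = ""                                     # someone at the target level

    def is_ready_for_calibration(self) -> bool:
        """Ready when there is evidence and 2-3 focus competencies are picked."""
        return bool(self.evidence) and 2 <= len(self.focus_competencies) <= 3


plan = GrowthPlan(
    engineer="Alex",
    current_level="L2",
    evidence=["Shipped billing feature end-to-end"],
    focus_competencies=["Architecture", "Mentorship"],
    mentor="Sam (L3)",
)
print(plan.is_ready_for_calibration())  # True
```

A structure like this makes the 4-6 week check-ins concrete: the conversation is about updating evidence and focus areas, not re-deciding what the plan is.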
This DAG automatically refreshes Docker ECR (Elastic Container Registry) authentication tokens in Apache Airflow. ECR tokens expire every 12 hours, so this DAG runs twice daily to ensure continuous access to your Docker registry.
Previously you could update the metadata database directly through a Session object, but that approach is now forbidden; the only supported way is to create an API user and use it to update the connection.
What This DAG Does
The DAG performs three main tasks:

1. Extract ECR Token: uses AWS boto3 to get a fresh authorization token from ECR
2. Update Docker Connection: updates Airflow's docker_default connection with the new token using JWT authentication
3. Test Connection: validates that the updated connection works properly
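For context on step 1: the authorization token ECR returns is a base64-encoded `username:password` pair (the username is always `AWS`). The decode step, shown here with a fabricated token since a real one comes from `ecr_client.get_authorization_token()`:

```python
import base64


def decode_ecr_token(token: str) -> tuple[str, str]:
    """Split a base64-encoded ECR authorization token into (username, password)."""
    decoded = base64.b64decode(token).decode()
    username, password = decoded.split(":", 1)
    return username, password


# Fabricated token for illustration only
fake_token = base64.b64encode(b"AWS:s3cr3t-docker-password").decode()
user, pwd = decode_ecr_token(fake_token)
print(user)  # AWS
```

This decoded password is what gets written into the `docker_default` connection in step 2.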
Prerequisites
- AWS credentials configured (via IAM role, environment variables, or AWS credentials file)
Step-by-Step Setup
Configure Airflow Variables
Set the following Airflow Variables in the Admin UI or via CLI:
```shell
# Via Airflow CLI
airflow variables set ecr_aws_account "123456789012"    # Your AWS account ID
airflow variables set ecr_aws_region_name "us-east-1"   # Your ECR region
```
Or via Airflow UI:
- Go to Admin → Variables
- Add ecr_aws_account with your AWS account ID
- Add ecr_aws_region_name with your ECR region
3. Create Airflow API Connection
This is required because Airflow no longer allows access to the metadata database via the session object; this restriction is new in Airflow 3.
The DAG will appear in the Airflow UI as refresh_docker_token_v4
7. Configure Queues and Pools
The DAG uses a systemqueue pool. Create it:
Via Airflow UI:
- Go to Admin → Pools
- Create a pool named systemqueue with appropriate slots (e.g., 5)
Via CLI:
```shell
airflow pools set systemqueue 5 "System maintenance tasks"
```
DAG Configuration
Schedule
Cron: 55 5,17 * * * (runs at 5:55 AM and 5:55 PM daily)
Timezone: UTC (adjust as needed)
Key Settings
- Max Active Runs: 1 (prevents overlapping executions)
- Catchup: False (doesn't backfill missed runs)
- Retries: 2, with a 1-minute delay
- Tags: ["airflow", "docker", "ecr"]
Troubleshooting
- Verify the Docker daemon is running
- Check that /var/run/docker.sock is accessible
- Ensure the ECR registry URL is correct
"""DAG to refresh Docker ECR authentication tokenUpdates the docker_default connection with fresh ECR credentials using JWT authenticationYou should not have your own: ~/.docker/config.json"""importbase64importloggingfromdatetimeimportdatetime,timedeltafromtypingimportAny,Dictimportboto3importrequestsfromairflow.decoratorsimportdag,taskfromairflow.hooks.baseimportBaseHookfromairflow.models.variableimportVariablefromairflow.providers.docker.hooks.dockerimportDockerHooklogger=logging.getLogger("ecr_docker_token_refresh")ecr_aws_account=Variable.get("ecr_aws_account")ecr_aws_region_name=Variable.get("ecr_aws_region_name")default_args={"retry_delay":timedelta(minutes=1),"depends_on_past":False,"retries":2,"email_on_failure":False,"email_on_retry":False,"queue":"systemqueue","pool":"systemqueue",}connection_id="docker_default"airflow_api_connection_id="airflow-api"defget_jwt_token(endpoint_url:str,username:str,password:str) ->str:"""Get JWT token from Airflow API"""auth_url=f"{endpoint_url}/auth/token"payload={"username":username,"password":password}headers={"Content-Type":"application/json"}logger.info(f"Requesting JWT token from {auth_url}")response=requests.post(auth_url,json=payload,headers=headers)response.raise_for_status() token_data = response.json()access_token=token_data.get("access_token")ifnotaccess_token:raiseValueError("No access_token found in response")logger.info("Successfully obtained JWT token")returnaccess_tokendefupdate_connection_password_with_jwt(endpoint_url:str,jwt_token:str,password:str) ->bool:"""Update connection password using JWT token with v2 bulk API"""url=f"{endpoint_url}/api/v2/connections"# First, get the current connection to preserve other fieldsget_url=f"{endpoint_url}/api/v2/connections/{connection_id}"headers={"Content-Type":"application/json","Authorization":f"Bearer {jwt_token}", }logger.info(f"Getting current connection {connection_id}")try:get_response=requests.get(get_url,headers=headers)get_response.raise_for_status() 
current_connection = get_response.json()logger.info(f"Current connection retrieved successfully")# Prepare bulk update payload using v2 APIpayload={"actions": [{"action":"update","entities": [{"connection_id":connection_id,"conn_type":current_connection.get("conn_type","docker"),"password":password,# This is what we're updating} ],"action_on_non_existence":"fail",} ] }logger.info(f"Updating connection {connection_id} at {url}")response=requests.patch(url,json=payload,headers=headers)response.raise_for_status() response_data = response.json()logger.info(f"Bulk update response: {response_data}")# Check if update was successfulupdate_results=response_data.get("update",{})success_count=len(update_results.get("success", []))error_count=len(update_results.get("errors", []))ifsuccess_count>0anderror_count==0:logger.info("Connection password updated successfully")returnTrue else:logger.error(f"Update failed - Success: {success_count}, Errors: {error_count}" )iferror_count>0:logger.error(f"Errors: {update_results.get('errors', [])}")returnFalseexceptrequests.exceptions.RequestExceptionase:logger.error(f"Failed to update connection: {e}")ifhasattr(e,"response") and e.response is not None:logger.error(f"Response status: {e.response.status_code}")logger.error(f"Response text: {e.response.text}")raise@dag(default_args=default_args,schedule="55 5,17 * * *",start_date=datetime.now()-timedelta(days=1),max_active_runs=1,catchup=False,tags=["airflow","docker","ecr"],dag_id="refresh_docker_token_v4",description="Refresh Docker ECR token using JWT authentication",)defrefresh_docker_token():@task(priority_weight=5, pool="systemqueue")defextract_ecr_token() ->Dict[str,Any]:"""Extract ECR authorization token using boto3"""logger.info("Starting ECR token extraction")try:logger.info(f"Connecting to ECR in region {ecr_aws_region_name}")ecr_client=boto3.client("ecr",region_name=ecr_aws_region_name)logger.info(f"Requesting authorization token for account 
{ecr_aws_account}")response=ecr_client.get_authorization_token(registryIds=[ecr_aws_account])auth_data=response["authorizationData"][0]token=auth_data["authorizationToken"]registry_url=auth_data["proxyEndpoint"]expires_at=auth_data["expiresAt"]logger.info("Successfully retrieved token")logger.info(f"Registry URL: {registry_url}")logger.info(f"Token expires at: {expires_at}")decoded_token=base64.b64decode(token).decode()username,password=decoded_token.split(":",1)logger.info(f"Decoded username: {username}")return{"registry_url":registry_url,"username":username,"password":password,"expires_at":expires_at.isoformat(),"raw_token":token, }exceptExceptionase:logger.error(f"Failed to extract ECR token: {str(e)}")raise@task(priority_weight=5, pool="systemqueue")defupdate_docker_connection(token_data:Dict[str,Any]) ->str:"""Update Docker connection using JWT authentication"""logger.info("Starting Docker connection update using JWT authentication")logger.info("Token data received from previous task")try:# Get Airflow API connection detailslogger.info(f"Retrieving Airflow API connection: {airflow_api_connection_id}" )api_connection=BaseHook.get_connection(airflow_api_connection_id)endpoint_url=f"{api_connection.schema}://{api_connection.host}"ifapi_connection.port:endpoint_url+=f":{api_connection.port}"username=api_connection.loginpassword=api_connection.passwordlogger.info(f"Using endpoint: {username} @ {endpoint_url}")jwt_token=get_jwt_token(endpoint_url,username,password)success=update_connection_password_with_jwt(endpoint_url,jwt_token,token_data["password"])ifsuccess:return"SUCCESS: Docker connection updated successfully using JWT authentication" else:raiseException("Failed to update connection")exceptExceptionase:logger.error(f"Failed to update Docker connection: {str(e)}")raise@task(priority_weight=3, pool="systemqueue")deftest_docker_connection() ->str:"""Test the updated Docker connection"""logger.info("Testing DockerHook...")try:# First get the connection details to 
debugconnection=BaseHook.get_connection("docker_default")docker_hook=DockerHook(docker_conn_id="docker_default",base_url="unix://var/run/docker.sock")logger.info("DockerHook created successfully")# Try to get docker client (this will test the connection more thoroughly)docker_client=docker_hook.get_conn()logger.info("Docker client connection established",docker_client.version())return"SUCCESS: Docker connection tested and working with DockerHook"exceptExceptionasclient_error:logger.error(f"Docker client test failed: {client_error}")# Show connection properties on failuretry:connection=BaseHook.get_connection("docker_default")exceptExceptionasconn_error:logger.error(f"Could not retrieve connection properties: {conn_error}")returnf"FAILED: Docker connection test failed: {client_error}"# Task flowtoken_data=extract_ecr_token()update_result=update_docker_connection(token_data)test_result=test_docker_connection()token_data>>update_result>>test_resultrefresh_docker_token_dag=refresh_docker_token()
This article would not have been possible without:
- https://gorails.com/guides/upgrading-postgresql-version-on-ubuntu-server
- https://www.directedignorance.com/blog/upgrading-postgresql-14-to-16-on-ubuntu
I know this article contains a lot of text, but trust me, it’s absolutely worth reading—you’ll become much more productive!
Fixing Incorrect File Paths
Use Case: You encounter an incorrect file path and need to locate the issue. Instead of scrutinizing the path segment by segment, a more effective approach is to list the path and start trimming it from the end until you find the correct segment.
This method will save you mental effort and reduce eye strain.
Example: You receive an error when attempting to open the following file: /home/user/projects/pizza/seed/images/themes/pizza/01.jpg
Use "Find" Instead of Scrolling
If you need to locate something, use the "Find" shortcut instead of scrolling and reading. The "Find" command is much faster and allows you to search for variables, class names, or even partial names.
Goto Line Approximately
If your error is on line 459, you can jump there with the go-to-line shortcut. You don't even need the exact number: type something close, such as 450, and line 459 will be right there among the numbered lines around it.
Use Code Folding
Use Case: To minimize distractions, use code folding to hide parts of the code you’re not currently working on.
Example:
In most code editors, you can collapse code blocks by clicking the small arrow next to the line numbers. This helps you focus on the part of the code you’re currently working on.
Comparing two branches
To compare two branches you can use git diff ...branch-name, but reading a raw diff takes a lot of effort.
A good way to deal with that is to clone a second copy of the repo so you have two repositories locally.
Usually I keep a "project-name" repository folder next to a "project-name-other" repository folder.
Then when I want to compare, I use a GUI diff tool to do the job. Mine is meld.
```shell
meld project-name/ project-name-other/
```
Change directory
Sometimes you want to change to the directory of a file. My way of doing that is to copy the currently opened file's path from the editor with a shortcut, grab the whole file path in the clipboard, and then run cdf with it, like this:
```shell
cdf() {
  if [ $# -eq 0 ]; then
    echo "No file path provided."
    return 1
  fi
  # Join all arguments with spaces
  local full_path="$*"
  local dir_path
  if [ -f "$full_path" ]; then
    # If it's a file, extract the directory path
    dir_path=$(dirname "$full_path")
  elif [ -d "$full_path" ]; then
    # If it's a directory, use it directly
    dir_path="$full_path"
  else
    echo "The path provided is neither a file nor a directory."
    return 1
  fi
  # Change to the directory
  cd "$dir_path" || {
    echo "Failed to change directory to $dir_path"
    return 1
  }
}
# usage: cdf /path/to/some/file.txt  -> cd /path/to/some
```
And to glue everything together, you need to switch to environment variables in dbt_project.yml. Note that this feature is not supported on old dbt versions. The documentation states that environment variables can be used in dbt_project.yml.
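For reference, a sketch of what that looks like. The project and variable names here are made up; `env_var` is the dbt Jinja function documented for this purpose, and its optional second argument is a default value:

```yaml
# dbt_project.yml -- values pulled from the environment at parse time
name: my_project
profile: "{{ env_var('DBT_PROFILE', 'default') }}"

models:
  my_project:
    +schema: "{{ env_var('DBT_SCHEMA') }}"
```

With this in place, the same project can target different schemas or profiles just by exporting different environment variables before running dbt.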
eXtreme Go Horse :: #methodologies, #architecture
- funny article: https://medium.com/@noriller/sprints-the-biggest-mistake-of-software-engineering-34115e7de008
wireguard readings :: #wireguard how to fix wireguard connection by changing mtu https://keremerkan.net/posts/wireguard-mtu-fixes/ collection of wireguard docs and tools – https://github.com/pirate/wireguard-docs