The module uses the jess/telnet image, which is broken: jessfraz/dockerfiles#547

The impact for this learning module is that the command hangs forever, and for OpenShift beginners it is hard to identify the root cause and to find a workaround.
I resolved it for myself by creating my own container image as described in the issue above.
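For anyone who wants to build such an image themselves, here is a minimal sketch of one possibility (it assumes an Alpine base, where the busybox-extras package provides the telnet client, and uses a placeholder registry path; it is not necessarily how my image is built):

$ cat > Dockerfile <<'EOF'
FROM alpine:3.12
# busybox-extras provides the telnet client on Alpine
RUN apk add --no-cache busybox-extras
ENTRYPOINT ["telnet"]
EOF
$ podman build -t <your-registry>/telnet .
$ podman push <your-registry>/telnet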
The following command hangs forever:
$ oc run -it --rm telnet --image=jess/telnet --restart=Never mcrouter 5000
Here are the logs:
$ oc get pods
NAME                                 READY   STATUS                 RESTARTS   AGE
mcrouter-7656564544-76fw7            1/1     Running                0          113s
mcrouter-memcached-0                 1/1     Running                0          116s
mcrouter-memcached-1                 1/1     Running                0          105s
mcrouter-operator-59ccfcb8d9-t5vdr   2/2     Running                0          3m19s
telnet                               0/1     CreateContainerError   0          85s
$ oc logs pods/telnet
Error from server (BadRequest): container "telnet" in pod "telnet" is waiting to start: CreateContainerError
$ oc describe pods/telnet
Name:         telnet
Namespace:    mcrouter
Priority:     0
Node:         crc-rtgqw-master-0/192.168.126.11
Start Time:   Sat, 05 Sep 2020 09:51:44 +0000
Labels:       run=telnet
Annotations:  k8s.v1.cni.cncf.io/networks-status:
                [{
                    "name": "openshift-sdn",
                    "interface": "eth0",
                    "ips": [
                        "10.128.0.69"
                    ],
                    "dns": {},
                    "default-route": [
                        "10.128.0.1"
                    ]
                }]
              openshift.io/scc: anyuid
Status:       Pending
IP:           10.128.0.69
IPs:
  IP:  10.128.0.69
Containers:
  telnet:
    Container ID:
    Image:          jess/telnet
    Image ID:
    Port:           <none>
    Host Port:      <none>
    Args:
      mcrouter
      5000
    State:          Waiting
      Reason:       CreateContainerError
    Ready:          False
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-vk2lg (ro)
Conditions:
  Type              Status
  Initialized       True
  Ready             False
  ContainersReady   False
  PodScheduled      True
Volumes:
  default-token-vk2lg:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-vk2lg
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s
                 node.kubernetes.io/unreachable:NoExecute for 300s
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Normal Scheduled <unknown> default-scheduler Successfully assigned mcrouter/telnet to crc-rtgqw-master-0
Warning Failed 101s kubelet, crc-rtgqw-master-0 Error: container create failed: time="2020-09-05T09:51:51Z" level=error msg="container_linux.go:349: starting container process caused \"exec: \\\"telnet\\\": executable file not found in $PATH\""
container_linux.go:349: starting container process caused "exec: \"telnet\": executable file not found in $PATH"
Warning Failed 98s kubelet, crc-rtgqw-master-0 Error: container create failed: time="2020-09-05T09:51:55Z" level=error msg="container_linux.go:349: starting container process caused \"exec: \\\"telnet\\\": executable file not found in $PATH\""
container_linux.go:349: starting container process caused "exec: \"telnet\": executable file not found in $PATH"
Warning Failed 81s kubelet, crc-rtgqw-master-0 Error: container create failed: time="2020-09-05T09:52:12Z" level=error msg="container_linux.go:349: starting container process caused \"exec: \\\"telnet\\\": executable file not found in $PATH\""
container_linux.go:349: starting container process caused "exec: \"telnet\": executable file not found in $PATH"
Warning Failed 65s kubelet, crc-rtgqw-master-0 Error: container create failed: time="2020-09-05T09:52:28Z" level=error msg="container_linux.go:349: starting container process caused \"exec: \\\"telnet\\\": executable file not found in $PATH\""
container_linux.go:349: starting container process caused "exec: \"telnet\": executable file not found in $PATH"
Warning Failed 49s kubelet, crc-rtgqw-master-0 Error: container create failed: time="2020-09-05T09:52:44Z" level=error msg="container_linux.go:349: starting container process caused \"exec: \\\"telnet\\\": executable file not found in $PATH\""
container_linux.go:349: starting container process caused "exec: \"telnet\": executable file not found in $PATH"
Warning Failed 32s kubelet, crc-rtgqw-master-0 Error: container create failed: time="2020-09-05T09:53:01Z" level=error msg="container_linux.go:349: starting container process caused \"exec: \\\"telnet\\\": executable file not found in $PATH\""
container_linux.go:349: starting container process caused "exec: \"telnet\": executable file not found in $PATH"
Normal Pulled 15s (x7 over 102s) kubelet, crc-rtgqw-master-0 Successfully pulled image "jess/telnet"
Warning Failed 15s kubelet, crc-rtgqw-master-0 Error: container create failed: time="2020-09-05T09:53:18Z" level=error msg="container_linux.go:349: starting container process caused \"exec: \\\"telnet\\\": executable file not found in $PATH\""
container_linux.go:349: starting container process caused "exec: \"telnet\": executable file not found in $PATH"
Normal Pulling 2s (x8 over 107s) kubelet, crc-rtgqw-master-0 Pulling image "jess/telnet"
$
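The events above already show the root cause: the container's entrypoint tries to exec "telnet", but no telnet executable is present in the image's $PATH. One way to confirm this locally, assuming podman (or docker) is available outside the cluster (the last command additionally assumes the image ships a shell):

$ podman pull docker.io/jess/telnet
$ podman inspect --format '{{.Config.Entrypoint}} {{.Config.Cmd}}' docker.io/jess/telnet
$ podman run --rm --entrypoint sh docker.io/jess/telnet -c 'command -v telnet || echo "telnet not found"'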
Here is my work-around, using my own image:
$ oc delete pods/telnet
pod "telnet" deleted
$ oc run -it --rm telnet --image=troppens/telnet --restart=Never mcrouter 5000
If you don't see a command prompt, try pressing enter.
set ansible 0 0 8
operator
STORED
get ansible
VALUE ansible 0 8
operator
END
quit
Connection closed by foreign host
pod "telnet" deleted
pod mcrouter/telnet terminated (Error)
$
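For context, the lines typed in the session above are plain memcached text protocol: "set ansible 0 0 8" stores an 8-byte value (the following line, "operator") under the key "ansible" with flags 0 and no expiry, and "get ansible" reads it back as "VALUE <key> <flags> <bytes>", the data, and "END". The same smoke test could also be scripted; a sketch, assuming nc is available in a pod that can resolve the mcrouter service:

$ printf 'set ansible 0 0 8\r\noperator\r\nget ansible\r\nquit\r\n' | nc mcrouter 5000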
My image is available on Docker Hub and Quay. I am happy to provide a pull request using my image, if this is of interest to the community.